ansible-playbook [core 2.16.18]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-ysS
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Monday 20 April 2026 15:48:09 -0400 (0:00:00.273) 0:00:00.273 **********
ok: [managed-node16]

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Monday 20 April 2026 15:48:13 -0400 (0:00:03.942) 0:00:04.216 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Reboot] ******************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Monday 20 April 2026 15:48:13 -0400 (0:00:00.184) 0:00:04.400 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Monday 20 April 2026 15:48:13 -0400 (0:00:00.078) 0:00:04.479 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Monday 20 April 2026 15:48:13 -0400 (0:00:00.165) 0:00:04.644 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Monday 20 April 2026 15:48:13 -0400 (0:00:00.162) 0:00:04.807 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Monday 20 April 2026 15:48:13 -0400 (0:00:00.119) 0:00:04.927 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Monday 20 April 2026 15:48:14 -0400 (0:00:00.176) 0:00:05.103 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False"}

TASK [Run the role] ************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Monday 20 April 2026 15:48:14 -0400 (0:00:00.116) 0:00:05.220 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16

TASK [Clear facts] *************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Monday 20 April 2026 15:48:14 -0400 (0:00:00.076) 0:00:05.296 **********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 15:48:14 -0400 (0:00:00.019) 0:00:05.316 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False"}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 15:48:14 -0400 (0:00:00.066) 0:00:05.382 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 15:48:14 -0400 (0:00:00.302) 0:00:05.684 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 15:48:14 -0400 (0:00:00.052) 0:00:05.737 **********
ok: [managed-node16]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 15:48:16 -0400 (0:00:01.831) 0:00:07.569 **********
ok: [managed-node16] => {"changed": false}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13
Monday 20 April 2026 15:48:18 -0400 (0:00:01.747) 0:00:09.316 **********
skipping: [managed-node16] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node16] => (item=CentOS.yml) => {"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False"}
ok: [managed-node16] => (item=CentOS_8.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"]}, "ansible_included_var_files": ["/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml"}
ok: [managed-node16] => (item=CentOS_8.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"]}, "ansible_included_var_files": ["/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml"}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 15:48:18 -0400 (0:00:00.304) 0:00:09.621 **********
ok: [managed-node16] => {"changed": false, "stat": {"exists": false}}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 15:48:20 -0400 (0:00:01.507) 0:00:11.128 **********
ok: [managed-node16] => {"ansible_facts": {"__storage_is_ostree": false}, "changed": false}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 15:48:20 -0400 (0:00:00.139) 0:00:11.268 **********
ok: [managed-node16] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 15:48:20 -0400 (0:00:00.083) 0:00:11.351 **********
ok: [managed-node16] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 15:48:20 -0400 (0:00:00.117) 0:00:11.469 **********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 15:48:20 -0400 (0:00:00.290) 0:00:11.759 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 15:48:21 -0400 (0:00:00.199) 0:00:11.958 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 15:48:21 -0400 (0:00:00.121) 0:00:12.080 **********
ok: [managed-node16] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 15:48:26 -0400 (0:00:05.389) 0:00:17.469 **********
ok: [managed-node16] => {"storage_pools | d([])": []}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 15:48:26 -0400 (0:00:00.177) 0:00:17.647 **********
ok: [managed-node16] => {"storage_volumes | d([])": []}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 15:48:26 -0400 (0:00:00.189) 0:00:17.837 **********
ok: [managed-node16] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": []}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 15:48:30 -0400 (0:00:03.220) 0:00:21.057 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 15:48:30 -0400 (0:00:00.149) 0:00:21.207 **********
skipping: [managed-node16] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 15:48:30 -0400 (0:00:00.045) 0:00:21.253 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 15:48:30 -0400 (0:00:00.130) 0:00:21.383 **********
skipping: [managed-node16] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 15:48:30 -0400 (0:00:00.088) 0:00:21.472 **********
ok: [managed-node16] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 15:48:33 -0400 (0:00:03.420) 0:00:24.892 **********
ok: [managed-node16] => {"ansible_facts": {"services": {
    "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
    "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "NetworkManager.service": {"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled"},
    "auditd.service": {"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static"},
    "auto-cpufreq.service": {"name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "autovt@.service": {"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
    "blivet.service": {"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static"},
    "blk-availability.service": {"name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled"},
    "chrony-dnssrv@.service": {"name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "chrony-wait.service": {"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "chronyd.service": {"name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "cloud-config.service": {"name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-final.service": {"name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static"},
    "cloud-init-local.service": {"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-init.service": {"name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "console-getty.service": {"name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "container-getty@.service": {"name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "cpupower.service": {"name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled"},
    "crond.service": {"name": "crond.service", "source": "systemd", "state": "running", "status": "enabled"},
    "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static"},
    "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static"},
    "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static"},
    "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
    "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static"},
    "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled"},
    "dbus.service": {"name": "dbus.service", "source": "systemd", "state": "running", "status": "static"},
    "debug-shell.service": {"name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "display-manager.service": {"name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "dm-event.service": {"name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dnf-makecache.service": {"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static"},
    "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "dracut-cmdline.service": {"name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-initqueue.service": {"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-mount.service": {"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static"},
    "dracut-shutdown.service": {"name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static"},
    "ebtables.service": {"name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "emergency.service": {"name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static"},
    "fcoe.service": {"name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "firewalld.service": {"name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "fstrim.service": {"name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static"},
    "getty@.service": {"name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
    "getty@tty1.service": {"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active"},
    "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static"},
    "gssproxy.service": {"name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled"},
    "halt-local.service": {"name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static"},
    "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "import-state.service": {"name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "initrd-cleanup.service": {"name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static"},
    "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static"},
    "initrd-switch-root.service": {"name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static"},
    "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static"},
    "iprdump.service": {"name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "iprinit.service": {"name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "iprupdate.service": {"name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "irqbalance.service": {"name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled"},
    "iscsi-shutdown.service": {"name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "iscsi.service": {"name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "iscsid.service": {"name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "kdump.service": {"name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static"},
    "kvm_stat.service": {"name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "ldconfig.service": {"name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static"},
    "loadmodules.service": {"name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "lvm2-activation-early.service": {"name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "lvm2-activation.service": {"name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static"},
    "lvm2-monitor.service": {"name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "lvm2-pvscan@.service": {"name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "lvm2-pvscan@8:0.service": {"name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active"},
    "man-db-cache-update.service": {"name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static"},
    "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "mdcheck_continue.service": {"name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static"},
    "mdcheck_start.service": {"name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static"},
    "mdmon@.service": {"name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static"},
    "mdmonitor.service": {"name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "messagebus.service": {"name": "messagebus.service", "source": "systemd", "state": "active", "status": "static"},
    "microcode.service": {"name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "multipathd.service": {"name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "ndctl-monitor.service": {"name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "network.service": {"name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "nfs-blkmap.service": {"name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "nfs-convert.service": {"name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "nfs-idmapd.service": {"name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static"},
    "nfs-mountd.service": {"name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static"},
    "nfs-server.service": {"name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled"},
    "nfs-utils.service": {"name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static"},
    "nfsdcld.service": {"name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static"},
    "nftables.service": {"name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "nis-domainname.service": {"name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "ntpd.service": {"name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "ntpdate.service": {"name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "oddjobd.service": {"name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "plymouth-halt.service": {"name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static"},
    "plymouth-kexec.service": {"name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static"},
    "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static"},
    "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static"},
    "plymouth-quit.service": {"name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static"},
    "plymouth-read-write.service": {"name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static"},
    "plymouth-reboot.service": {"name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static"},
    "plymouth-start.service": {"name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static"},
    "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static"},
    "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static"},
    "polkit.service": {"name": "polkit.service", "source": "systemd", "state": "running", "status": "static"},
    "power-profiles-daemon.service": {"name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled"},
    "quotaon.service": {"name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static"},
    "rbdmap.service": {"name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "rc-local.service": {"name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static"},
    "rdisc.service": {"name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "rescue.service": {"name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static"},
    "restraintd.service": {"name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "rngd.service": {"name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "rpc-gssd.service": {"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static"},
    "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static"},
    "rpc-statd.service": {"name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static"},
    "rpcbind.service": {"name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled"},
    "rsyslog.service": {"name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled"},
    "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static"},
    "serial-getty@.service": {"name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled"},
    "snapd.seeded.service": {"name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "sntp.service": {"name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "sshd-keygen.service": {"name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "sshd-keygen@.service": {"name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled"},
    "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive"},
    "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive"},
    "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive"},
    "sshd.service": {"name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "sshd@.service": {"name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "sssd-autofs.service": {"name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect"},
    "sssd-kcm.service": {"name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect"},
    "sssd-nss.service": {"name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect"},
    "sssd-pac.service": {"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect"},
    "sssd-pam.service": {"name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect"},
    "sssd-ssh.service": {"name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect"},
    "sssd-sudo.service": {"name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect"},
    "sssd.service": {"name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "stratis-fstab-setup@.service": {"name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "stratisd-min-postinitrd.service": {"name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static"},
    "stratisd.service": {"name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "syslog.service": {"name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled"},
    "system-update-cleanup.service": {"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static"},
    "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static"},
    "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static"},
    "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static"},
    "systemd-backlight@.service": {"name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "systemd-binfmt.service": {"name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static"},
    "systemd-coredump@.service": {"name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "systemd-exit.service": {"name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static"},
    "systemd-firstboot.service": {"name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static"},
    "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static"},
    "systemd-fsck@.service": {"name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "systemd-halt.service": {"name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static"},
    "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "systemd-hibernate.service": {"name": "systemd-hibernate.service", "source": "systemd", "state":
"inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": 
"systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:48:37 -0400 (0:00:03.411) 0:00:28.304 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : 
Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:48:37 -0400 (0:00:00.248) 0:00:28.553 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:48:39 -0400 (0:00:01.613) 0:00:30.167 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:48:39 -0400 (0:00:00.112) 0:00:30.279 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776713993.8599024, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776713992.6419075, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776713992.6419075, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK 
[fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:48:40 -0400 (0:00:00.969) 0:00:31.249 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:48:40 -0400 (0:00:00.098) 0:00:31.348 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:48:40 -0400 (0:00:00.280) 0:00:31.629 ********** ok: [managed-node16] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:48:40 -0400 (0:00:00.171) 0:00:31.800 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:48:41 -0400 (0:00:00.187) 0:00:31.987 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } 
TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:48:41 -0400 (0:00:00.206) 0:00:32.194 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:48:41 -0400 (0:00:00.190) 0:00:32.384 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:48:41 -0400 (0:00:00.186) 0:00:32.570 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:48:41 -0400 (0:00:00.218) 0:00:32.788 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:48:41 -0400 (0:00:00.125) 0:00:32.914 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:48:42 -0400 (0:00:00.170) 0:00:33.085 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776713441.1001608, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:48:43 -0400 (0:00:01.131) 0:00:34.216 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:48:43 -0400 (0:00:00.055) 0:00:34.272 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:48:44 -0400 (0:00:01.527) 0:00:35.800 ********** ok: [managed-node16] => { "changed": false } TASK [Get unused disks] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:75 Monday 20 April 2026 15:48:45 -0400 (0:00:00.965) 0:00:36.765 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node16 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Monday 20 April 2026 15:48:46 -0400 (0:00:00.203) 0:00:36.969 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Monday 20 April 2026 15:48:49 -0400 (0:00:03.362) 0:00:40.331 ********** ok: [managed-node16] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" 
SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Monday 20 April 2026 15:48:56 -0400 (0:00:07.191) 0:00:47.523 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "'Unable to find unused disk' in unused_disks_return.disks", "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Monday 20 April 2026 15:48:56 -0400 (0:00:00.102) 0:00:47.625 ********** ok: [managed-node16] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Monday 20 April 2026 15:48:56 -0400 (0:00:00.146) 0:00:47.771 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "unused_disks | d([]) | length < disks_needed | d(1)", "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Monday 20 April 2026 15:48:56 -0400 (0:00:00.138) 0:00:47.910 ********** ok: [managed-node16] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:84 Monday 20 April 2026 15:48:57 -0400 (0:00:00.098) 0:00:48.009 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:48:57 -0400 (0:00:00.241) 0:00:48.250 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:48:57 -0400 (0:00:00.125) 0:00:48.375 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:48:57 -0400 (0:00:00.232) 0:00:48.607 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 
20 April 2026 15:48:57 -0400 (0:00:00.001) 0:00:48.609 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:48:57 -0400 (0:00:00.229) 0:00:48.838 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:48:58 -0400 (0:00:00.166) 0:00:49.005 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:48:58 -0400 (0:00:00.157) 0:00:49.162 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:49:00 -0400 (0:00:02.001) 0:00:51.163 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:49:01 -0400 (0:00:00.822) 0:00:51.985 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: 
[managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:49:01 -0400 (0:00:00.474) 0:00:52.460 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:49:01 
-0400 (0:00:00.146) 0:00:52.607 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:49:01 -0400 (0:00:00.117) 0:00:52.725 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:49:01 -0400 (0:00:00.129) 0:00:52.854 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:49:02 -0400 (0:00:00.126) 0:00:52.981 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:49:02 -0400 (0:00:00.343) 0:00:53.324 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] 
********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:49:02 -0400 (0:00:00.210) 0:00:53.535 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:49:02 -0400 (0:00:00.156) 0:00:53.692 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:49:06 -0400 (0:00:03.440) 0:00:57.133 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:49:06 -0400 (0:00:00.218) 0:00:57.351 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:49:06 -0400 (0:00:00.155) 0:00:57.507 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : 
Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:49:11 -0400 (0:00:04.921) 0:01:02.428 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:49:11 -0400 (0:00:00.149) 0:01:02.578 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:49:11 -0400 (0:00:00.121) 0:01:02.699 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:49:11 -0400 (0:00:00.075) 0:01:02.775 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:49:11 -0400 (0:00:00.123) 0:01:02.899 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task 
path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:49:15 -0400 (0:00:03.415) 0:01:06.314 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { 
"name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, 
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": 
"systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:49:17 -0400 (0:00:02.605) 0:01:08.919 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:49:18 -0400 (0:00:00.407) 0:01:09.327 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:49:23 -0400 (0:00:05.006) 0:01:14.333 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'foo' missing key/password", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 
'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:49:23 -0400 (0:00:00.221) 0:01:14.555 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:49:23 -0400 (0:00:00.284) 0:01:14.840 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:49:24 -0400 (0:00:00.227) 0:01:15.067 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:49:24 -0400 (0:00:00.279) 0:01:15.347 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:100 Monday 20 April 2026 15:49:24 -0400 (0:00:00.159) 0:01:15.507 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:49:24 -0400 (0:00:00.236) 0:01:15.743 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:49:24 -0400 (0:00:00.001) 0:01:15.744 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:49:25 -0400 (0:00:00.210) 0:01:15.955 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:49:25 -0400 (0:00:00.167) 0:01:16.122 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] 
*** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:49:25 -0400 (0:00:00.122) 0:01:16.245 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:49:27 -0400 (0:00:02.423) 0:01:18.668 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:49:28 -0400 (0:00:01.233) 0:01:19.902 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", 
"libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:49:29 -0400 (0:00:00.459) 0:01:20.362 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:49:29 -0400 (0:00:00.182) 0:01:20.544 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:49:29 -0400 (0:00:00.222) 0:01:20.767 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:49:30 -0400 (0:00:00.206) 0:01:20.973 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:49:30 -0400 (0:00:00.209) 0:01:21.184 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:49:30 -0400 (0:00:00.475) 0:01:21.660 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:49:30 -0400 (0:00:00.127) 0:01:21.788 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:49:31 -0400 (0:00:00.205) 0:01:21.993 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:49:34 -0400 (0:00:03.313) 
0:01:25.307 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:49:34 -0400 (0:00:00.108) 0:01:25.415 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:49:34 -0400 (0:00:00.240) 0:01:25.656 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:49:39 -0400 (0:00:04.891) 0:01:30.548 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:49:39 -0400 (0:00:00.141) 0:01:30.690 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:49:39 -0400 (0:00:00.204) 0:01:30.895 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:49:40 -0400 (0:00:00.141) 0:01:31.036 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:49:40 -0400 (0:00:00.123) 0:01:31.160 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:49:43 -0400 (0:00:03.248) 0:01:34.408 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" 
}, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": 
"dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", 
"source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": 
"stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": 
"systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" 
}, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task 
path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:49:45 -0400 (0:00:02.159) 0:01:36.567 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:49:45 -0400 (0:00:00.076) 0:01:36.644 ********** changed: [managed-node16] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-72e0c453-925f-4c37-ac54-222606f3f95e", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:49:58 -0400 (0:00:12.718) 0:01:49.363 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:49:58 -0400 (0:00:00.111) 0:01:49.474 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776713993.8599024, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776713992.6419075, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", 
"mtime": 1776713992.6419075, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:49:59 -0400 (0:00:01.031) 0:01:50.506 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:50:01 -0400 (0:00:02.112) 0:01:52.618 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:50:01 -0400 (0:00:00.305) 0:01:52.924 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-72e0c453-925f-4c37-ac54-222606f3f95e", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e" ], 
"mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:50:02 -0400 (0:00:00.139) 0:01:53.063 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:50:02 -0400 (0:00:00.185) 0:01:53.249 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:50:02 -0400 (0:00:00.148) 0:01:53.397 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:50:02 -0400 (0:00:00.154) 0:01:53.551 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:50:05 -0400 (0:00:02.712) 0:01:56.264 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:50:07 -0400 (0:00:02.510) 0:01:58.775 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, 
"false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:50:07 -0400 (0:00:00.139) 0:01:58.914 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:50:09 -0400 (0:00:01.457) 0:02:00.372 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776713441.1001608, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : 
Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:50:10 -0400 (0:00:00.874) 0:02:01.247 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-72e0c453-925f-4c37-ac54-222606f3f95e', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-72e0c453-925f-4c37-ac54-222606f3f95e", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:50:11 -0400 (0:00:01.333) 0:02:02.580 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:50:13 -0400 (0:00:01.820) 0:02:04.401 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:112 Monday 20 April 2026 15:50:14 -0400 (0:00:01.117) 0:02:05.518 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:50:14 -0400 (0:00:00.260) 0:02:05.779 ********** skipping: [managed-node16] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print 
out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:50:14 -0400 (0:00:00.112) 0:02:05.892 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:50:15 -0400 (0:00:00.229) 0:02:06.121 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "size": "10G", "type": "crypt", "uuid": "a5b0cd4c-1bae-45e4-befc-7558397aa10b" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "72e0c453-925f-4c37-ac54-222606f3f95e" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": 
"fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:50:17 -0400 (0:00:02.264) 0:02:08.386 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003102", "end": "2026-04-20 15:50:19.714346", "rc": 0, "start": "2026-04-20 15:50:19.711244" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:50:19 -0400 (0:00:02.424) 0:02:10.811 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002773", "end": "2026-04-20 15:50:20.837937", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:50:20.835164" } STDOUT: luks-72e0c453-925f-4c37-ac54-222606f3f95e /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:50:21 -0400 (0:00:01.136) 0:02:11.948 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:50:21 -0400 (0:00:00.189) 0:02:12.165 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 
'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:50:21 -0400 (0:00:00.214) 0:02:12.379 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:50:21 -0400 (0:00:00.213) 0:02:12.593 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:50:23 -0400 (0:00:01.420) 0:02:14.013 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:50:23 -0400 (0:00:00.187) 0:02:14.201 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:50:23 -0400 (0:00:00.291) 0:02:14.492 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was 
False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:50:23 -0400 (0:00:00.281) 0:02:14.774 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:50:24 -0400 (0:00:00.184) 0:02:14.958 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:50:24 -0400 (0:00:00.257) 0:02:15.216 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:50:24 -0400 (0:00:00.263) 0:02:15.479 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:50:24 -0400 (0:00:00.200) 0:02:15.679 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:50:24 -0400 (0:00:00.125) 0:02:15.805 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:50:25 -0400 (0:00:00.127) 0:02:15.933 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:50:25 -0400 (0:00:00.244) 0:02:16.177 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:50:25 -0400 (0:00:00.220) 0:02:16.397 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:50:26 -0400 (0:00:00.530) 0:02:16.928 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:50:26 -0400 (0:00:00.241) 0:02:17.169 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:50:26 -0400 (0:00:00.377) 0:02:17.546 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:50:26 -0400 (0:00:00.209) 0:02:17.756 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:50:27 -0400 (0:00:00.228) 0:02:17.984 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:50:27 -0400 (0:00:00.288) 0:02:18.273 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:50:27 -0400 (0:00:00.364) 0:02:18.637 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:50:28 -0400 (0:00:00.387) 0:02:19.024 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714598.0792794, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714598.0792794, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37410, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": 
"inode/blockdevice", "mode": "0660", "mtime": 1776714598.0792794, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:50:29 -0400 (0:00:01.617) 0:02:20.642 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:50:29 -0400 (0:00:00.229) 0:02:20.872 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:50:30 -0400 (0:00:00.232) 0:02:21.104 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:50:30 -0400 (0:00:00.298) 0:02:21.403 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:50:30 -0400 (0:00:00.281) 0:02:21.684 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:50:30 -0400 (0:00:00.155) 0:02:21.839 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:50:31 -0400 (0:00:00.138) 0:02:21.978 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714598.2202787, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714598.2202787, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 178005, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776714598.2202787, "nlink": 1, "path": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:50:32 -0400 (0:00:01.352) 0:02:23.330 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:50:35 -0400 (0:00:03.535) 0:02:26.866 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012512", "end": "2026-04-20 15:50:37.174730", "rc": 0, "start": "2026-04-20 15:50:37.162218" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 72e0c453-925f-4c37-ac54-222606f3f95e Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 924187 Threads: 2 Salt: f4 c0 ba 23 15 c9 03 7c 57 ea 3d 78 9e e0 3d 52 9d 34 0e bc 61 7f 35 c0 70 31 56 15 5c 42 43 87 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 117448 Salt: 10 87 58 9c c9 23 39 50 e3 ff 4b 96 69 a9 04 c4 27 db 6d 80 65 ce 83 57 db 97 c9 4e cf 9b 6e ac Digest: 4c 99 62 13 b6 6a a4 07 e5 42 3b 73 d8 11 19 1a d7 4c 32 3d f1 87 d3 b4 ac d5 6a 00 2b d8 0f dc TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:50:37 -0400 (0:00:01.412) 0:02:28.278 ********** 
ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:50:37 -0400 (0:00:00.110) 0:02:28.389 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:50:37 -0400 (0:00:00.214) 0:02:28.604 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:50:37 -0400 (0:00:00.149) 0:02:28.753 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:50:37 -0400 (0:00:00.074) 0:02:28.828 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:50:38 -0400 (0:00:00.457) 0:02:29.286 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:50:38 -0400 (0:00:00.164) 0:02:29.451 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:50:38 -0400 (0:00:00.164) 0:02:29.615 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-72e0c453-925f-4c37-ac54-222606f3f95e /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:50:38 -0400 (0:00:00.230) 0:02:29.845 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:50:39 -0400 (0:00:00.336) 0:02:30.182 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:50:39 -0400 (0:00:00.339) 0:02:30.522 ********** ok: [managed-node16] => { "changed": 
false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:50:39 -0400 (0:00:00.285) 0:02:30.807 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:50:40 -0400 (0:00:00.306) 0:02:31.114 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:50:40 -0400 (0:00:00.208) 0:02:31.322 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:50:40 -0400 (0:00:00.214) 0:02:31.537 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:50:40 -0400 (0:00:00.191) 
0:02:31.729 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:50:40 -0400 (0:00:00.172) 0:02:31.901 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:50:41 -0400 (0:00:00.223) 0:02:32.124 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:50:41 -0400 (0:00:00.215) 0:02:32.340 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:50:41 -0400 (0:00:00.240) 0:02:32.580 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:50:41 -0400 (0:00:00.253) 0:02:32.834 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:50:42 -0400 (0:00:00.165) 0:02:32.999 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:50:42 -0400 (0:00:00.231) 0:02:33.231 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:50:42 -0400 (0:00:00.113) 0:02:33.345 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:50:42 -0400 (0:00:00.101) 0:02:33.446 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:50:42 -0400 (0:00:00.207) 0:02:33.654 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:50:42 -0400 (0:00:00.269) 0:02:33.924 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:50:43 -0400 (0:00:00.237) 0:02:34.161 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:50:43 -0400 (0:00:00.176) 0:02:34.337 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 
15:50:43 -0400 (0:00:00.164) 0:02:34.501 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:50:43 -0400 (0:00:00.221) 0:02:34.723 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:50:43 -0400 (0:00:00.198) 0:02:34.921 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:50:44 -0400 (0:00:00.201) 0:02:35.123 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:50:44 -0400 (0:00:00.193) 0:02:35.316 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:50:44 -0400 (0:00:00.096) 0:02:35.412 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:50:44 -0400 (0:00:00.180) 0:02:35.592 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:50:44 -0400 (0:00:00.238) 0:02:35.831 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:50:45 -0400 (0:00:00.252) 0:02:36.084 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:50:45 -0400 (0:00:00.282) 0:02:36.366 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not 
storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:50:45 -0400 (0:00:00.207) 0:02:36.574 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:50:45 -0400 (0:00:00.175) 0:02:36.750 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:50:46 -0400 (0:00:00.281) 0:02:37.031 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:50:46 -0400 (0:00:00.195) 0:02:37.227 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:50:46 -0400 (0:00:00.180) 0:02:37.407 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not 
storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:50:46 -0400 (0:00:00.205) 0:02:37.613 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:50:46 -0400 (0:00:00.191) 0:02:37.805 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:50:47 -0400 (0:00:00.189) 0:02:37.994 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:50:47 -0400 (0:00:00.237) 0:02:38.231 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:50:47 -0400 (0:00:00.265) 0:02:38.497 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:50:47 -0400 (0:00:00.302) 0:02:38.799 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:50:48 -0400 (0:00:00.432) 0:02:39.232 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:50:48 -0400 (0:00:00.269) 0:02:39.501 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:50:48 -0400 (0:00:00.175) 0:02:39.677 ********** 
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:50:48 -0400 (0:00:00.176) 0:02:39.854 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:50:49 -0400 (0:00:00.195) 0:02:40.049 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:50:49 -0400 (0:00:00.220) 0:02:40.270 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:50:49 -0400 (0:00:00.224) 0:02:40.495 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", 
"skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:50:49 -0400 (0:00:00.209) 0:02:40.704 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:50:50 -0400 (0:00:00.245) 0:02:40.949 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 15:50:50 -0400 (0:00:00.112) 0:02:41.061 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:118 Monday 20 April 2026 15:50:53 -0400 (0:00:03.043) 0:02:44.104 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:50:53 -0400 (0:00:00.333) 
0:02:44.438 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:50:53 -0400 (0:00:00.213) 0:02:44.651 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:50:53 -0400 (0:00:00.161) 0:02:44.812 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:50:53 -0400 (0:00:00.001) 0:02:44.814 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:50:54 -0400 (0:00:00.214) 0:02:45.028 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:50:54 -0400 (0:00:00.341) 0:02:45.369 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml 
for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:50:54 -0400 (0:00:00.173) 0:02:45.543 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:50:56 -0400 (0:00:02.339) 0:02:47.882 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:50:58 -0400 (0:00:01.250) 0:02:49.133 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:50:58 -0400 (0:00:00.521) 0:02:49.654 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:50:59 -0400 (0:00:00.318) 0:02:49.973 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:50:59 -0400 (0:00:00.173) 0:02:50.146 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:50:59 -0400 (0:00:00.187) 0:02:50.334 ********** ok: [managed-node16] => { 
"ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:50:59 -0400 (0:00:00.166) 0:02:50.501 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:50:59 -0400 (0:00:00.385) 0:02:50.886 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:51:00 -0400 (0:00:00.172) 0:02:51.059 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:51:00 -0400 (0:00:00.167) 0:02:51.226 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:51:04 -0400 (0:00:03.856) 0:02:55.082 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:51:04 -0400 (0:00:00.136) 0:02:55.218 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:51:04 -0400 (0:00:00.161) 0:02:55.380 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:51:09 -0400 (0:00:05.080) 0:03:00.461 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:51:09 -0400 (0:00:00.112) 0:03:00.573 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make 
sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:51:09 -0400 (0:00:00.102) 0:03:00.676 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:51:09 -0400 (0:00:00.091) 0:03:00.767 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:51:09 -0400 (0:00:00.070) 0:03:00.838 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:51:12 -0400 (0:00:03.027) 0:03:03.866 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": 
"crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": 
"power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": 
"systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { 
"name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:51:15 -0400 (0:00:02.343) 0:03:06.209 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:51:15 -0400 (0:00:00.370) 0:03:06.580 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-72e0c453-925f-4c37-ac54-222606f3f95e' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:51:20 -0400 (0:00:05.039) 0:03:11.619 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-72e0c453-925f-4c37-ac54-222606f3f95e' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 
0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:51:20 -0400 (0:00:00.103) 0:03:11.723 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:51:20 -0400 (0:00:00.109) 0:03:11.832 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:51:20 -0400 (0:00:00.082) 0:03:11.915 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:51:21 -0400 (0:00:00.112) 
0:03:12.027 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:51:21 -0400 (0:00:00.072) 0:03:12.099 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714652.9850411, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714652.9850411, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776714652.9850411, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2653090384", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:51:21 -0400 (0:00:00.806) 0:03:12.906 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:139 Monday 20 April 2026 15:51:22 -0400 (0:00:00.155) 0:03:13.062 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:51:22 -0400 (0:00:00.224) 0:03:13.287 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:51:22 -0400 (0:00:00.002) 0:03:13.290 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:51:22 -0400 (0:00:00.241) 0:03:13.531 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:51:22 -0400 (0:00:00.066) 0:03:13.597 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:51:22 -0400 (0:00:00.063) 0:03:13.661 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:51:24 -0400 (0:00:01.481) 0:03:15.142 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:51:25 -0400 (0:00:00.869) 0:03:16.012 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:51:25 -0400 (0:00:00.191) 0:03:16.203 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:51:25 -0400 (0:00:00.147) 0:03:16.350 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:51:25 -0400 (0:00:00.160) 0:03:16.510 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:51:25 -0400 (0:00:00.039) 0:03:16.550 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:51:25 -0400 (0:00:00.027) 0:03:16.577 ********** redirecting (type: modules) ansible.builtin.mount to 
ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:51:25 -0400 (0:00:00.109) 0:03:16.687 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:51:25 -0400 (0:00:00.035) 0:03:16.722 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:51:25 -0400 (0:00:00.069) 0:03:16.792 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:51:29 -0400 (0:00:03.412) 0:03:20.205 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:51:29 -0400 
(0:00:00.182) 0:03:20.388 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:51:29 -0400 (0:00:00.155) 0:03:20.543 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:51:34 -0400 (0:00:05.106) 0:03:25.649 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:51:34 -0400 (0:00:00.237) 0:03:25.887 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:51:35 -0400 (0:00:00.142) 0:03:26.030 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ 
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:51:35 -0400 (0:00:00.182) 0:03:26.212 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:51:35 -0400 (0:00:00.171) 0:03:26.384 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:51:38 -0400 (0:00:03.251) 0:03:29.635 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", 
"source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { 
"name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", 
"status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": 
"rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", 
"state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": 
"systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:51:41 -0400 (0:00:02.308) 0:03:31.944 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:51:41 -0400 (0:00:00.198) 0:03:32.142 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-72e0c453-925f-4c37-ac54-222606f3f95e", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": 
"defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:51:46 -0400 (0:00:05.245) 0:03:37.388 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:51:46 -0400 (0:00:00.094) 0:03:37.482 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714607.647238, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "fc996f6da636c9b48e1db0fcae34eca82a605e92", "ctime": 1776714607.644238, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714607.644238, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK 
[fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:51:47 -0400 (0:00:01.095) 0:03:38.578 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:51:48 -0400 (0:00:01.120) 0:03:39.698 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:51:48 -0400 (0:00:00.162) 0:03:39.860 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-72e0c453-925f-4c37-ac54-222606f3f95e", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": 
"UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:51:49 -0400 (0:00:00.111) 0:03:39.972 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:51:49 -0400 (0:00:00.124) 0:03:40.097 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", 
"_kernel_device": "/dev/sda", "_mount_id": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:51:49 -0400 (0:00:00.102) 0:03:40.199 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e", "state": "absent" }, "name": "/opt/test1", "opts": 
"defaults", "passno": "0", "src": "/dev/mapper/luks-72e0c453-925f-4c37-ac54-222606f3f95e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:51:50 -0400 (0:00:01.544) 0:03:41.744 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:51:52 -0400 (0:00:01.581) 0:03:43.325 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': 'UUID=02e40ae4-666f-4e23-b710-3d84226cb128', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:51:53 -0400 (0:00:01.385) 0:03:44.711 ********** skipping: [managed-node16] => (item={'src': 'UUID=02e40ae4-666f-4e23-b710-3d84226cb128', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 
'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:51:54 -0400 (0:00:00.274) 0:03:44.985 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:51:55 -0400 (0:00:01.688) 0:03:46.674 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714620.8371806, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f1d6d652137380b35c5b440f7ae13412c2464d25", "ctime": 1776714611.502221, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 505413764, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776714611.5012212, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1774372734", "wgrp": 
false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:51:56 -0400 (0:00:01.111) 0:03:47.785 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-72e0c453-925f-4c37-ac54-222606f3f95e', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-72e0c453-925f-4c37-ac54-222606f3f95e", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:51:57 -0400 (0:00:01.115) 0:03:48.901 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:51:59 -0400 (0:00:01.826) 0:03:50.727 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:152 Monday 20 April 2026 15:52:01 -0400 (0:00:01.580) 0:03:52.308 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 
April 2026 15:52:01 -0400 (0:00:00.427) 0:03:52.736 ********** skipping: [managed-node16] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:52:01 -0400 (0:00:00.153) 0:03:52.889 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:52:02 -0400 (0:00:00.221) 0:03:53.111 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "02e40ae4-666f-4e23-b710-3d84226cb128" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:52:03 -0400 
(0:00:01.247) 0:03:54.359 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003072", "end": "2026-04-20 15:52:04.678053", "rc": 0, "start": "2026-04-20 15:52:04.674981" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=02e40ae4-666f-4e23-b710-3d84226cb128 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:52:04 -0400 (0:00:01.367) 0:03:55.727 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002765", "end": 
"2026-04-20 15:52:05.661060", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:52:05.658295" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:52:05 -0400 (0:00:01.024) 0:03:56.751 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:52:05 -0400 (0:00:00.071) 0:03:56.823 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=02e40ae4-666f-4e23-b710-3d84226cb128', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'}) TASK [Set 
storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:52:06 -0400 (0:00:00.187) 0:03:57.010 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:52:06 -0400 (0:00:00.224) 0:03:57.235 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount 
device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:52:07 -0400 (0:00:01.078) 0:03:58.313 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:52:07 -0400 (0:00:00.221) 0:03:58.535 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:52:07 -0400 (0:00:00.244) 0:03:58.780 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:52:08 -0400 (0:00:00.287) 0:03:59.068 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:52:08 -0400 (0:00:00.292) 0:03:59.360 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:52:08 -0400 (0:00:00.209) 0:03:59.570 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:52:08 -0400 (0:00:00.177) 0:03:59.748 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:52:08 -0400 (0:00:00.167) 0:03:59.915 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:52:09 -0400 (0:00:00.079) 0:03:59.995 ********** skipping: [managed-node16] => { "changed": false, 
"false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:52:09 -0400 (0:00:00.137) 0:04:00.133 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:52:09 -0400 (0:00:00.152) 0:04:00.285 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:52:09 -0400 (0:00:00.095) 0:04:00.381 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=02e40ae4-666f-4e23-b710-3d84226cb128 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:52:09 -0400 (0:00:00.366) 0:04:00.748 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:52:09 -0400 (0:00:00.172) 0:04:00.921 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:52:10 -0400 (0:00:00.179) 0:04:01.101 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:52:10 -0400 (0:00:00.145) 0:04:01.247 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:52:10 -0400 (0:00:00.442) 0:04:01.689 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, 
"storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:52:10 -0400 (0:00:00.123) 0:04:01.812 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:52:11 -0400 (0:00:00.154) 0:04:01.967 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:52:11 -0400 (0:00:00.225) 0:04:02.192 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714706.252811, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714706.252811, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37410, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776714706.252811, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:52:12 -0400 (0:00:01.331) 0:04:03.523 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:52:12 -0400 (0:00:00.197) 0:04:03.721 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:52:12 -0400 (0:00:00.122) 0:04:03.843 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:52:13 -0400 (0:00:00.196) 0:04:04.040 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:52:13 -0400 (0:00:00.157) 0:04:04.197 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026 15:52:13 -0400 (0:00:00.167) 0:04:04.365 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026 15:52:13 -0400 (0:00:00.160) 0:04:04.525 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026 15:52:13 -0400 (0:00:00.183) 0:04:04.709 **********
ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026 15:52:17 -0400 (0:00:03.285) 0:04:07.994 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 20 April 2026 15:52:17 -0400 (0:00:00.163) 0:04:08.157 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 20 April 2026 15:52:17 -0400 (0:00:00.140) 0:04:08.298 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 20 April 2026 15:52:17 -0400 (0:00:00.202) 0:04:08.500 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026 15:52:17 -0400 (0:00:00.151) 0:04:08.652 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026 15:52:17 -0400 (0:00:00.173) 0:04:08.825 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026 15:52:18 -0400 (0:00:00.184) 0:04:09.010 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026 15:52:18 -0400 (0:00:00.198) 0:04:09.209 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026 15:52:18 -0400 (0:00:00.211) 0:04:09.420 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026 15:52:18 -0400 (0:00:00.315) 0:04:09.735 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026 15:52:19 -0400 (0:00:00.399) 0:04:10.135 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026 15:52:19 -0400 (0:00:00.167) 0:04:10.303 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026 15:52:19 -0400 (0:00:00.278) 0:04:10.581 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026 15:52:19 -0400 (0:00:00.260) 0:04:10.841 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026 15:52:20 -0400 (0:00:00.209) 0:04:11.051 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026 15:52:20 -0400 (0:00:00.127) 0:04:11.178 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026 15:52:20 -0400 (0:00:00.211) 0:04:11.389 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026 15:52:20 -0400 (0:00:00.209) 0:04:11.599 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 20 April 2026 15:52:20 -0400 (0:00:00.184) 0:04:11.783 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 20 April 2026 15:52:21 -0400 (0:00:00.169) 0:04:11.953 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 20 April 2026 15:52:21 -0400 (0:00:00.214) 0:04:12.167 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026 15:52:21 -0400 (0:00:00.159) 0:04:12.326 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026 15:52:21 -0400 (0:00:00.231) 0:04:12.558 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026 15:52:21 -0400 (0:00:00.175) 0:04:12.733 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 15:52:22 -0400 (0:00:00.286) 0:04:13.020 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 15:52:22 -0400 (0:00:00.123) 0:04:13.143 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 15:52:22 -0400 (0:00:00.115) 0:04:13.259 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 15:52:22 -0400 (0:00:00.183) 0:04:13.442 **********
ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 15:52:22 -0400 (0:00:00.185) 0:04:13.628 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 15:52:22 -0400 (0:00:00.100) 0:04:13.728 **********
skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 15:52:22 -0400 (0:00:00.166) 0:04:13.894 **********
skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 15:52:23 -0400 (0:00:00.143) 0:04:14.038 **********
skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 15:52:23 -0400 (0:00:00.151) 0:04:14.190 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 15:52:23 -0400 (0:00:00.119) 0:04:14.309 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 15:52:23 -0400 (0:00:00.190) 0:04:14.499 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 15:52:23 -0400 (0:00:00.155) 0:04:14.655 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 15:52:23 -0400 (0:00:00.180) 0:04:14.836 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 15:52:24 -0400 (0:00:00.180) 0:04:15.016 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 15:52:24 -0400 (0:00:00.199) 0:04:15.216 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 15:52:24 -0400 (0:00:00.214) 0:04:15.431 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April 2026 15:52:24 -0400 (0:00:00.177) 0:04:15.608 **********
skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" }

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 20 April 2026 15:52:24 -0400 (0:00:00.171) 0:04:15.780 **********
skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" }

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 15:52:25 -0400 (0:00:00.175) 0:04:15.955 **********
skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" }

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 15:52:25 -0400 (0:00:00.104) 0:04:16.060 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 15:52:25 -0400 (0:00:00.075) 0:04:16.135 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 15:52:25 -0400 (0:00:00.072) 0:04:16.207 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 20 April 2026 15:52:25 -0400 (0:00:00.090) 0:04:16.297 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 20 April 2026 15:52:25 -0400 (0:00:00.088) 0:04:16.385 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 20 April 2026 15:52:25 -0400 (0:00:00.094) 0:04:16.480 **********
ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 20 April 2026 15:52:25 -0400 (0:00:00.079) 0:04:16.559 **********
ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 20 April 2026 15:52:25 -0400 (0:00:00.043) 0:04:16.602 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 20 April 2026 15:52:25 -0400 (0:00:00.070) 0:04:16.673 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 15:52:25 -0400 (0:00:00.036) 0:04:16.709 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 15:52:25 -0400 (0:00:00.060) 0:04:16.770 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 15:52:25 -0400 (0:00:00.069) 0:04:16.839 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 15:52:25 -0400 (0:00:00.070) 0:04:16.910 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 15:52:26 -0400 (0:00:00.058) 0:04:16.968 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 15:52:26 -0400 (0:00:00.051) 0:04:17.020 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 15:52:26 -0400 (0:00:00.066) 0:04:17.086 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 15:52:26 -0400 (0:00:00.080) 0:04:17.167 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Monday 20 April 2026 15:52:26 -0400 (0:00:00.105) 0:04:17.273 **********
changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode - 2] ******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:158
Monday 20 April 2026 15:52:27 -0400 (0:00:00.831) 0:04:18.104 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 20 April 2026 15:52:27 -0400 (0:00:00.309) 0:04:18.414 **********
ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 20 April 2026 15:52:27 -0400 (0:00:00.138) 0:04:18.552 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16

TASK [Clear facts] *************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Monday 20 April 2026 15:52:27 -0400 (0:00:00.120) 0:04:18.673 **********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 15:52:27 -0400 (0:00:00.089) 0:04:18.762 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 15:52:27 -0400 (0:00:00.087) 0:04:18.849 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 15:52:28 -0400 (0:00:00.369) 0:04:19.219 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 15:52:28 -0400 (0:00:00.176) 0:04:19.396 **********
ok: [managed-node16]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 15:52:30 -0400 (0:00:01.894) 0:04:21.290 **********
ok: [managed-node16] => { "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13
Monday 20 April 2026 15:52:31 -0400 (0:00:01.074) 0:04:22.365 **********
skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 15:52:31 -0400 (0:00:00.424) 0:04:22.789 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 15:52:31 -0400 (0:00:00.110) 0:04:22.899 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 15:52:32 -0400 (0:00:00.172) 0:04:23.071 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 15:52:32 -0400 (0:00:00.181) 0:04:23.253 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 15:52:32 -0400 (0:00:00.161) 0:04:23.415 **********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 15:52:32 -0400 (0:00:00.405) 0:04:23.820 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 15:52:33 -0400 (0:00:00.119) 0:04:23.940 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 15:52:33 -0400 (0:00:00.215) 0:04:24.155 **********
ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 15:52:36 -0400 (0:00:03.541) 0:04:27.697 **********
ok: [managed-node16] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 15:52:36 -0400 (0:00:00.115) 0:04:27.812 **********
ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 15:52:36 -0400 (0:00:00.107) 0:04:27.919 **********
ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 15:52:41 -0400 (0:00:04.338) 0:04:32.257 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 15:52:41 -0400 (0:00:00.239) 0:04:32.497 **********
skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 15:52:41 -0400 (0:00:00.095) 0:04:32.592
********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:52:41 -0400 (0:00:00.094) 0:04:32.686 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:52:41 -0400 (0:00:00.118) 0:04:32.805 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:52:45 -0400 (0:00:03.267) 0:04:36.072 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": 
"ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service": { "name": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-cryptsetup@luks…2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service": { "name": "systemd-cryptsetup@luks…2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": 
"static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": 
{ "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": 
"ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:52:47 -0400 (0:00:02.463) 0:04:38.535 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d72e0c453\x2d925f\x2d4c37\x2dac54\x2d222606f3f95e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "name": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-sda.device systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service 
cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-72e0c453-925f-4c37-ac54-222606f3f95e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-72e0c453-925f-4c37-ac54-222606f3f95e /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-72e0c453-925f-4c37-ac54-222606f3f95e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": 
"systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:51:55 EDT", "StateChangeTimestampMonotonic": "1942559629", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", 
"TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2d925f\x2d4c37\x2dac54\x2d222606f3f95e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "name": "systemd-cryptsetup@luks…2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon 
cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", 
"LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": 
"inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:52:50 -0400 (0:00:03.199) 0:04:41.734 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:52:56 -0400 (0:00:05.445) 0:04:47.180 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 
'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:52:56 -0400 (0:00:00.076) 0:04:47.256 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d72e0c453\x2d925f\x2d4c37\x2dac54\x2d222606f3f95e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "name": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", 
"CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d72e0c453\\x2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", 
"PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2d925f\x2d4c37\x2dac54\x2d222606f3f95e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "name": 
"systemd-cryptsetup@luks…2d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", 
"ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d925f\\x2d4c37\\x2dac54\\x2d222606f3f95e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": 
"yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:52:58 -0400 (0:00:01.950) 0:04:49.207 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:52:58 -0400 (0:00:00.140) 0:04:49.347 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:52:58 -0400 (0:00:00.217) 0:04:49.564 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:52:58 -0400 (0:00:00.105) 0:04:49.670 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714747.0526352, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 
0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714747.0526352, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776714747.0526352, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1288812580", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:52:59 -0400 (0:00:00.869) 0:04:50.539 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:179 Monday 20 April 2026 15:52:59 -0400 (0:00:00.104) 0:04:50.644 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:53:00 -0400 (0:00:00.303) 0:04:50.947 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 
20 April 2026 15:53:00 -0400 (0:00:00.001) 0:04:50.949 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:53:00 -0400 (0:00:00.100) 0:04:51.050 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:53:00 -0400 (0:00:00.163) 0:04:51.213 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:53:00 -0400 (0:00:00.138) 0:04:51.351 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:53:02 -0400 (0:00:02.004) 0:04:53.356 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:53:03 -0400 (0:00:00.719) 0:04:54.076 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: 
[managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:53:03 -0400 (0:00:00.219) 0:04:54.295 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:53:03 
-0400 (0:00:00.097) 0:04:54.393 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:53:03 -0400 (0:00:00.047) 0:04:54.441 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:53:03 -0400 (0:00:00.037) 0:04:54.478 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:53:03 -0400 (0:00:00.115) 0:04:54.594 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:53:03 -0400 (0:00:00.232) 0:04:54.826 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] 
********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:53:04 -0400 (0:00:00.141) 0:04:54.967 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:53:04 -0400 (0:00:00.229) 0:04:55.197 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:53:07 -0400 (0:00:03.184) 0:04:58.382 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:53:07 -0400 (0:00:00.134) 0:04:58.517 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:53:07 -0400 (0:00:00.163) 0:04:58.680 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK 
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:53:12 -0400 (0:00:05.237) 0:05:03.918 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:53:13 -0400 (0:00:00.436) 0:05:04.354 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:53:13 -0400 (0:00:00.231) 0:05:04.586 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:53:13 -0400 (0:00:00.096) 0:05:04.683 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:53:13 -0400 (0:00:00.103) 0:05:04.786 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get 
service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:53:16 -0400 (0:00:02.981) 0:05:07.768 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": 
"debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { 
"name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": 
"stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": 
"rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": 
"systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": 
"systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": 
"systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:53:19 -0400 (0:00:02.487) 0:05:10.255 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:53:19 -0400 (0:00:00.224) 0:05:10.480 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": 
"luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:53:32 -0400 (0:00:13.337) 0:05:23.818 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:53:32 -0400 (0:00:00.094) 0:05:23.912 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714713.5987794, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "86235800fc7524fdbede934b00a0ba0c95a4fe18", "ctime": 1776714713.5957794, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714713.5957794, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:53:33 -0400 (0:00:00.737) 0:05:24.650 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:53:34 -0400 (0:00:00.960) 0:05:25.611 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:53:34 -0400 (0:00:00.201) 0:05:25.812 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:53:35 -0400 (0:00:00.267) 0:05:26.080 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:53:35 -0400 (0:00:00.122) 0:05:26.203 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:53:35 -0400 (0:00:00.069) 0:05:26.273 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': 'UUID=02e40ae4-666f-4e23-b710-3d84226cb128', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=02e40ae4-666f-4e23-b710-3d84226cb128" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 
Monday 20 April 2026 15:53:36 -0400 (0:00:01.092) 0:05:27.365 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:53:38 -0400 (0:00:01.636) 0:05:29.002 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:53:39 -0400 (0:00:01.351) 0:05:30.354 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", 
"mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:53:39 -0400 (0:00:00.162) 0:05:30.516 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:53:41 -0400 (0:00:01.421) 0:05:31.938 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714725.6597273, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714717.848761, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 119537870, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776714717.847761, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1469735413", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:53:42 -0400 (0:00:01.215) 0:05:33.154 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-f7031584-2dc9-4b44-87e9-1952aed231a9', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:53:43 -0400 (0:00:01.100) 0:05:34.254 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:53:45 -0400 (0:00:01.747) 0:05:36.002 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:192 Monday 20 April 2026 15:53:46 -0400 (0:00:01.185) 0:05:37.187 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:53:46 -0400 (0:00:00.519) 0:05:37.707 ********** skipping: [managed-node16] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task 
path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:53:46 -0400 (0:00:00.105) 0:05:37.812 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:53:47 -0400 (0:00:00.190) 0:05:38.003 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "size": "10G", "type": "crypt", "uuid": "8f907b1a-ecea-453a-8449-377b318ab745" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f7031584-2dc9-4b44-87e9-1952aed231a9" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": 
"fe591198-9082-4b15-9b62-e83518524cd2" } } }
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Monday 20 April 2026 15:53:48 -0400 (0:00:01.008) 0:05:39.012 **********
ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002994", "end": "2026-04-20 15:53:49.176787", "rc": 0, "start": "2026-04-20 15:53:49.173793" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path:
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:53:49 -0400 (0:00:01.268) 0:05:40.281 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002944", "end": "2026-04-20 15:53:50.385538", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:53:50.382594" } STDOUT: luks-f7031584-2dc9-4b44-87e9-1952aed231a9 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:53:50 -0400 (0:00:01.232) 0:05:41.514 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:53:50 -0400 (0:00:00.147) 0:05:41.661 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 
'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:53:50 -0400 (0:00:00.195) 0:05:41.857 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:53:51 -0400 (0:00:00.318) 0:05:42.176 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:53:52 -0400 (0:00:00.967) 0:05:43.143 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:53:52 -0400 (0:00:00.280) 0:05:43.423 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:53:52 -0400 (0:00:00.176) 0:05:43.600 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was 
False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:53:52 -0400 (0:00:00.192) 0:05:43.792 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:53:53 -0400 (0:00:00.272) 0:05:44.065 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:53:53 -0400 (0:00:00.310) 0:05:44.375 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:53:53 -0400 (0:00:00.266) 0:05:44.642 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:53:53 -0400 (0:00:00.270) 0:05:44.913 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:53:54 -0400 (0:00:00.226) 0:05:45.139 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:53:54 -0400 (0:00:00.200) 0:05:45.339 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:53:54 -0400 (0:00:00.227) 0:05:45.566 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:53:54 -0400 (0:00:00.119) 0:05:45.686 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:53:55 -0400 (0:00:00.332) 0:05:46.019 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:53:55 -0400 (0:00:00.196) 0:05:46.215 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:53:55 -0400 (0:00:00.144) 0:05:46.360 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:53:55 -0400 (0:00:00.120) 0:05:46.480 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:53:55 -0400 (0:00:00.250) 0:05:46.730 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:53:55 -0400 (0:00:00.150) 0:05:46.880 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:53:56 -0400 (0:00:00.174) 0:05:47.055 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:53:56 -0400 (0:00:00.197) 0:05:47.253 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714812.5713525, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714812.5713525, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37410, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": 
"inode/blockdevice", "mode": "0660", "mtime": 1776714812.5713525, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:53:57 -0400 (0:00:01.072) 0:05:48.325 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:53:57 -0400 (0:00:00.110) 0:05:48.436 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:53:57 -0400 (0:00:00.193) 0:05:48.630 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:53:57 -0400 (0:00:00.142) 0:05:48.773 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:53:57 -0400 (0:00:00.121) 0:05:48.895 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:53:58 -0400 (0:00:00.063) 0:05:48.958 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:53:58 -0400 (0:00:00.151) 0:05:49.110 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714812.690352, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714812.690352, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 208338, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776714812.690352, "nlink": 1, "path": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:53:59 -0400 (0:00:01.461) 0:05:50.571 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:54:02 -0400 (0:00:03.112) 0:05:53.684 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010660", "end": "2026-04-20 15:54:03.906416", "rc": 0, "start": "2026-04-20 15:54:03.895756" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: f7031584-2dc9-4b44-87e9-1952aed231a9 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 919618 Threads: 2 Salt: 75 5a 26 00 51 f5 fe a9 5b a3 f7 55 90 7f 8b d9 8b 44 01 29 5e d1 ee 49 55 ad 0d 5f 3c 07 32 4f AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 118940 Salt: 39 60 81 92 4c 43 d9 73 f6 9e 2a b7 9a 1e e7 62 8e 7e d6 30 c9 97 20 76 d4 b3 c7 5a ab 7a 3f 6c Digest: 93 b4 e5 8d 40 3d 14 8c 28 f2 1c dd cd 0d fa ee d7 de f5 87 3a c3 e3 92 71 d1 13 4d 7d d7 28 cc TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:54:04 -0400 (0:00:01.322) 0:05:55.006 ********** 
ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:54:04 -0400 (0:00:00.197) 0:05:55.204 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:54:04 -0400 (0:00:00.270) 0:05:55.475 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:54:04 -0400 (0:00:00.251) 0:05:55.727 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:54:05 -0400 (0:00:00.209) 0:05:55.937 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:54:05 -0400 (0:00:00.389) 0:05:56.326 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:54:05 -0400 (0:00:00.292) 0:05:56.618 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:54:05 -0400 (0:00:00.174) 0:05:56.793 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f7031584-2dc9-4b44-87e9-1952aed231a9 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:54:06 -0400 (0:00:00.236) 0:05:57.030 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:54:06 -0400 (0:00:00.200) 0:05:57.230 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:54:06 -0400 (0:00:00.221) 0:05:57.451 ********** ok: [managed-node16] => { "changed": 
false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:54:06 -0400 (0:00:00.135) 0:05:57.587 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:54:06 -0400 (0:00:00.125) 0:05:57.712 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:54:06 -0400 (0:00:00.144) 0:05:57.857 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:54:07 -0400 (0:00:00.221) 0:05:58.079 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:54:07 -0400 (0:00:00.159) 
0:05:58.239 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:54:07 -0400 (0:00:00.192) 0:05:58.432 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:54:07 -0400 (0:00:00.198) 0:05:58.630 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:54:07 -0400 (0:00:00.170) 0:05:58.801 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:54:08 -0400 (0:00:00.143) 0:05:58.944 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:54:08 -0400 (0:00:00.165) 0:05:59.110 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:54:08 -0400 (0:00:00.156) 0:05:59.267 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:54:08 -0400 (0:00:00.152) 0:05:59.419 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:54:08 -0400 (0:00:00.126) 0:05:59.546 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:54:08 -0400 (0:00:00.098) 0:05:59.645 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:54:08 -0400 (0:00:00.110) 0:05:59.755 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:54:08 -0400 (0:00:00.119) 0:05:59.875 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:54:09 -0400 (0:00:00.175) 0:06:00.051 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:54:09 -0400 (0:00:00.201) 0:06:00.252 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 
15:54:09 -0400 (0:00:00.184) 0:06:00.437 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:54:09 -0400 (0:00:00.143) 0:06:00.580 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:54:09 -0400 (0:00:00.104) 0:06:00.684 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:54:09 -0400 (0:00:00.128) 0:06:00.812 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:54:10 -0400 (0:00:00.118) 0:06:00.931 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:54:10 -0400 (0:00:00.197) 0:06:01.129 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:54:10 -0400 (0:00:00.237) 0:06:01.366 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:54:10 -0400 (0:00:00.131) 0:06:01.497 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:54:10 -0400 (0:00:00.203) 0:06:01.701 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:54:10 -0400 (0:00:00.136) 0:06:01.837 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not 
storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:54:11 -0400 (0:00:00.235) 0:06:02.073 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:54:11 -0400 (0:00:00.214) 0:06:02.288 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:54:11 -0400 (0:00:00.197) 0:06:02.485 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:54:11 -0400 (0:00:00.270) 0:06:02.756 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:54:12 -0400 (0:00:00.210) 0:06:02.967 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not 
storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:54:12 -0400 (0:00:00.129) 0:06:03.096 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:54:12 -0400 (0:00:00.156) 0:06:03.253 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:54:12 -0400 (0:00:00.232) 0:06:03.486 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:54:12 -0400 (0:00:00.215) 0:06:03.702 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:54:12 -0400 (0:00:00.138) 0:06:03.840 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:54:13 -0400 (0:00:00.188) 0:06:04.029 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:54:13 -0400 (0:00:00.138) 0:06:04.168 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:54:13 -0400 (0:00:00.141) 0:06:04.309 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:54:13 -0400 (0:00:00.121) 0:06:04.430 ********** 
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:54:13 -0400 (0:00:00.375) 0:06:04.805 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:54:14 -0400 (0:00:00.150) 0:06:04.956 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:54:14 -0400 (0:00:00.096) 0:06:05.053 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:54:14 -0400 (0:00:00.112) 0:06:05.165 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", 
"skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:54:14 -0400 (0:00:00.180) 0:06:05.346 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:54:14 -0400 (0:00:00.147) 0:06:05.494 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:199 Monday 20 April 2026 15:54:14 -0400 (0:00:00.099) 0:06:05.593 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:54:15 -0400 (0:00:00.529) 0:06:06.123 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:54:15 -0400 (0:00:00.178) 0:06:06.302 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:54:15 -0400 (0:00:00.136) 0:06:06.438 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:54:15 -0400 (0:00:00.001) 0:06:06.440 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:54:15 -0400 (0:00:00.229) 0:06:06.670 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:54:15 -0400 (0:00:00.197) 0:06:06.868 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:54:16 -0400 (0:00:00.111) 0:06:06.979 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:54:18 -0400 (0:00:02.411) 0:06:09.390 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:54:20 -0400 (0:00:01.719) 0:06:11.110 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:54:20 -0400 (0:00:00.568) 0:06:11.679 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:54:20 -0400 (0:00:00.123) 0:06:11.802 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:54:21 -0400 (0:00:00.161) 0:06:11.964 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:54:21 -0400 (0:00:00.188) 0:06:12.152 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:54:21 -0400 (0:00:00.197) 0:06:12.350 ********** redirecting (type: modules) ansible.builtin.mount to 
ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:54:21 -0400 (0:00:00.347) 0:06:12.698 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:54:21 -0400 (0:00:00.163) 0:06:12.861 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:54:22 -0400 (0:00:00.205) 0:06:13.067 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:54:25 -0400 (0:00:03.806) 0:06:16.873 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK 
[fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:54:26 -0400 (0:00:00.325) 0:06:17.199 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:54:26 -0400 (0:00:00.312) 0:06:17.511 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:54:31 -0400 (0:00:04.879) 0:06:22.390 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:54:31 -0400 (0:00:00.282) 0:06:22.672 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:54:31 -0400 (0:00:00.071) 0:06:22.744 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:54:32 -0400 (0:00:00.188) 0:06:22.932 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:54:32 -0400 (0:00:00.133) 0:06:23.065 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:54:35 -0400 (0:00:03.687) 0:06:26.752 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": 
"blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": 
"sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": 
"systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:54:38 -0400 (0:00:02.959) 0:06:29.712 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : 
Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:54:39 -0400 (0:00:00.229) 0:06:29.941 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:54:43 -0400 (0:00:04.975) 0:06:34.916 ********** fatal: [managed-node16]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 
'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:54:44 -0400 (0:00:00.154) 0:06:35.071 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:54:44 -0400 (0:00:00.342) 0:06:35.413 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:54:44 -0400 (0:00:00.151) 0:06:35.565 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:54:45 -0400 (0:00:00.383) 0:06:35.948 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:219 Monday 20 April 2026 15:54:45 -0400 (0:00:00.186) 0:06:36.135 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:54:45 -0400 (0:00:00.661) 0:06:36.797 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:54:45 -0400 (0:00:00.001) 0:06:36.798 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:54:46 -0400 (0:00:00.190) 0:06:36.989 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:54:46 -0400 (0:00:00.355) 0:06:37.345 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:54:46 -0400 (0:00:00.065) 0:06:37.411 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:54:48 -0400 (0:00:01.526) 0:06:38.937 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific 
variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:54:48 -0400 (0:00:00.790) 0:06:39.728 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:54:49 
-0400 (0:00:00.322) 0:06:40.050 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:54:49 -0400 (0:00:00.166) 0:06:40.217 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:54:49 -0400 (0:00:00.170) 0:06:40.387 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:54:49 -0400 (0:00:00.153) 0:06:40.541 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:54:49 -0400 (0:00:00.095) 0:06:40.636 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] 
************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:54:49 -0400 (0:00:00.273) 0:06:40.910 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:54:50 -0400 (0:00:00.141) 0:06:41.052 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:54:50 -0400 (0:00:00.197) 0:06:41.249 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:54:53 -0400 (0:00:03.514) 0:06:44.764 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:54:53 -0400 (0:00:00.057) 0:06:44.821 
********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:54:53 -0400 (0:00:00.037) 0:06:44.859 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:54:58 -0400 (0:00:04.731) 0:06:49.591 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:54:58 -0400 (0:00:00.231) 0:06:49.822 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:54:59 -0400 (0:00:00.157) 0:06:49.979 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:54:59 -0400 (0:00:00.076) 0:06:50.056 ********** 
skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:54:59 -0400 (0:00:00.054) 0:06:50.110 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:55:02 -0400 (0:00:03.462) 0:06:53.573 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": 
"inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": 
"modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { 
"name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": 
"systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:55:05 -0400 (0:00:02.549) 0:06:56.122 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:55:05 -0400 (0:00:00.306) 0:06:56.429 
********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": 
"foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:55:19 -0400 (0:00:14.066) 0:07:10.495 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:55:19 -0400 (0:00:00.126) 0:07:10.622 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714819.2853236, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ea0bc038fbeb5875505d2b92a32904ad5933b6df", "ctime": 1776714819.2823236, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714819.2823236, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:55:20 -0400 (0:00:01.275) 0:07:11.897 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:55:22 -0400 (0:00:01.099) 0:07:12.997 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:55:22 -0400 
(0:00:00.280) 0:07:13.278 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, 
"encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:55:22 -0400 (0:00:00.285) 0:07:13.564 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:55:22 -0400 (0:00:00.057) 0:07:13.621 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:55:22 -0400 (0:00:00.042) 0:07:13.664 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f7031584-2dc9-4b44-87e9-1952aed231a9" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:55:23 -0400 (0:00:01.165) 0:07:14.829 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:55:25 -0400 (0:00:01.664) 0:07:16.494 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': 
None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:55:26 -0400 (0:00:01.193) 0:07:17.687 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:55:27 -0400 (0:00:00.286) 0:07:17.973 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:55:28 -0400 (0:00:01.737) 0:07:19.711 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714830.3842757, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "18c30a5f6c5aca5960498fda1acdfb08aa9fb277", "ctime": 1776714823.2063067, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 268435653, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776714823.2043068, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "687477679", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:55:30 -0400 (0:00:01.380) 0:07:21.092 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda', 'name': 'luks-f7031584-2dc9-4b44-87e9-1952aed231a9', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node16] => (item={'backing_device': '/dev/sda1', 'name': 'luks-228a619c-db48-4cff-ab7e-39c8c44696a2', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", 
"name": "luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:55:32 -0400 (0:00:02.472) 0:07:23.564 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:55:34 -0400 (0:00:01.665) 0:07:25.230 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:236 Monday 20 April 2026 15:55:35 -0400 (0:00:01.108) 0:07:26.339 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:55:35 -0400 (0:00:00.420) 0:07:26.759 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": 
"/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:55:36 -0400 (0:00:00.284) 0:07:27.044 ********** skipping: [managed-node16] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:55:36 -0400 (0:00:00.133) 0:07:27.178 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "size": "4G", "type": "crypt", "uuid": "6fb6b983-6503-43df-b691-eeab283cfd41" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "228a619c-db48-4cff-ab7e-39c8c44696a2" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": 
"xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:55:37 -0400 (0:00:01.138) 0:07:28.316 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002720", "end": "2026-04-20 15:55:38.350331", "rc": 0, "start": "2026-04-20 15:55:38.347611" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2 /opt/test1 xfs 
defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:55:38 -0400 (0:00:01.130) 0:07:29.447 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002806", "end": "2026-04-20 15:55:39.645000", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:55:39.642194" } STDOUT: luks-228a619c-db48-4cff-ab7e-39c8c44696a2 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:55:39 -0400 (0:00:01.259) 0:07:30.706 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': 
[], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 15:55:40 -0400 (0:00:00.300) 0:07:31.007 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 15:55:40 -0400 (0:00:00.203) 0:07:31.210 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 15:55:40 -0400 (0:00:00.086) 0:07:31.297 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was 
False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 15:55:40 -0400 (0:00:00.177) 0:07:31.474 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 => (item=members) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 15:55:41 -0400 (0:00:00.461) 0:07:31.935 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 15:55:41 -0400 (0:00:00.174) 0:07:32.110 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 15:55:41 -0400 (0:00:00.236) 0:07:32.346 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 15:55:41 -0400 (0:00:00.182) 0:07:32.529 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 15:55:41 -0400 (0:00:00.271) 0:07:32.801 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 15:55:42 -0400 (0:00:00.128) 0:07:32.929 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 15:55:42 -0400 (0:00:00.162) 0:07:33.092 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 15:55:42 -0400 (0:00:00.164) 0:07:33.257 ********** skipping: [managed-node16] => { "changed": false, 
"false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 15:55:42 -0400 (0:00:00.172) 0:07:33.429 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 15:55:42 -0400 (0:00:00.227) 0:07:33.657 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.41.179 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 15:55:44 -0400 (0:00:01.275) 0:07:34.933 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 15:55:44 -0400 (0:00:00.061) 0:07:34.994 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 15:55:44 -0400 (0:00:00.305) 
0:07:35.300 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 15:55:44 -0400 (0:00:00.099) 0:07:35.400 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 15:55:44 -0400 (0:00:00.116) 0:07:35.516 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 15:55:44 -0400 (0:00:00.167) 0:07:35.683 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 15:55:44 -0400 (0:00:00.149) 0:07:35.833 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 15:55:45 -0400 (0:00:00.193) 0:07:36.026 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 15:55:45 -0400 (0:00:00.084) 0:07:36.110 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 15:55:45 -0400 (0:00:00.709) 0:07:36.820 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 15:55:46 -0400 (0:00:00.159) 0:07:36.980 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 15:55:46 -0400 (0:00:00.134) 0:07:37.115 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": 
"Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 15:55:46 -0400 (0:00:00.139) 0:07:37.255 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 15:55:46 -0400 (0:00:00.123) 0:07:37.378 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 15:55:46 -0400 (0:00:00.383) 0:07:37.762 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 
'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] 
******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 15:55:47 -0400 (0:00:00.166) 0:07:37.928 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 15:55:47 -0400 (0:00:00.373) 0:07:38.301 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": 
"storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 15:55:47 -0400 (0:00:00.204) 0:07:38.505 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 15:55:47 -0400 (0:00:00.391) 0:07:38.897 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 15:55:48 -0400 (0:00:00.145) 0:07:39.043 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 15:55:48 -0400 (0:00:00.180) 0:07:39.224 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 15:55:48 -0400 (0:00:00.091) 0:07:39.315 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 15:55:48 -0400 (0:00:00.092) 0:07:39.408 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 15:55:48 -0400 (0:00:00.433) 0:07:39.842 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 15:55:49 -0400 (0:00:00.211) 0:07:40.053 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 15:55:49 -0400 (0:00:00.559) 0:07:40.613 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } 
TASK [Print script output] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 15:55:49 -0400 (0:00:00.147) 0:07:40.760 ********** skipping: [managed-node16] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 15:55:49 -0400 (0:00:00.114) 0:07:40.875 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 15:55:50 -0400 (0:00:00.136) 0:07:41.011 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 15:55:50 -0400 (0:00:00.113) 0:07:41.124 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 15:55:50 -0400 (0:00:00.135) 0:07:41.260 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 15:55:50 -0400 (0:00:00.123) 0:07:41.383 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 15:55:50 -0400 (0:00:00.145) 0:07:41.529 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 15:55:50 -0400 (0:00:00.132) 0:07:41.662 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': 
None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:55:51 -0400 (0:00:00.280) 0:07:41.942 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:55:51 -0400 (0:00:00.186) 0:07:42.129 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:55:52 -0400 (0:00:01.262) 0:07:43.391 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:55:52 -0400 (0:00:00.297) 0:07:43.688 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:55:52 -0400 (0:00:00.180) 0:07:43.869 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and 
storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:55:53 -0400 (0:00:00.296) 0:07:44.165 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:55:53 -0400 (0:00:00.238) 0:07:44.404 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:55:53 -0400 (0:00:00.250) 0:07:44.654 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:55:53 -0400 (0:00:00.155) 0:07:44.810 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:55:54 -0400 (0:00:00.155) 0:07:44.965 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:55:54 -0400 (0:00:00.184) 0:07:45.150 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:55:54 -0400 (0:00:00.153) 0:07:45.303 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:55:54 -0400 (0:00:00.081) 0:07:45.385 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:55:54 -0400 
(0:00:00.287) 0:07:45.672 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:55:55 -0400 (0:00:00.315) 0:07:45.988 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:55:55 -0400 (0:00:00.229) 0:07:46.217 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:55:55 -0400 (0:00:00.249) 0:07:46.467 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:55:55 -0400 (0:00:00.107) 0:07:46.575 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean 
up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:55:55 -0400 (0:00:00.170) 0:07:46.745 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:55:55 -0400 (0:00:00.110) 0:07:46.855 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:55:56 -0400 (0:00:00.292) 0:07:47.148 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:55:56 -0400 (0:00:00.132) 0:07:47.280 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714919.206893, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714919.206893, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 223114, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, 
"islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776714919.206893, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:55:57 -0400 (0:00:00.884) 0:07:48.164 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:55:57 -0400 (0:00:00.280) 0:07:48.444 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:55:57 -0400 (0:00:00.173) 0:07:48.617 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:55:57 -0400 (0:00:00.276) 0:07:48.894 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) 
(2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:55:58 -0400 (0:00:00.117) 0:07:49.012 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:55:58 -0400 (0:00:00.106) 0:07:49.118 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:55:58 -0400 (0:00:00.136) 0:07:49.255 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714919.3548923, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714919.3548923, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 223205, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776714919.3548923, "nlink": 1, "path": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:55:59 -0400 (0:00:01.259) 0:07:50.514 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:56:03 -0400 (0:00:03.492) 0:07:54.007 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010183", "end": "2026-04-20 15:56:04.023517", "rc": 0, "start": "2026-04-20 15:56:04.013334" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 228a619c-db48-4cff-ab7e-39c8c44696a2 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 926692 Threads: 2 Salt: 37 f9 8d 75 c0 fb a0 6c 47 0e c1 03 44 e8 36 31 46 7c 6a f4 c8 c0 ff a0 49 3b c4 f3 e7 00 ad 95 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 30 9d de 24 54 85 63 0f de 83 8b 6c f8 3e f8 1b ad 6c d9 5b 00 da 0f 0a 7b 9e d6 3f 36 d6 91 ee Digest: a3 f1 34 5a 22 23 77 32 96 a3 32 1b 72 b6 36 68 af 58 b9 e8 e1 5b af 95 19 c0 0a fa 1c fd 09 b8 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:56:04 -0400 (0:00:01.132) 0:07:55.139 
********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:56:04 -0400 (0:00:00.152) 0:07:55.292 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:56:04 -0400 (0:00:00.248) 0:07:55.540 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:56:04 -0400 (0:00:00.203) 0:07:55.744 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:56:04 -0400 (0:00:00.175) 0:07:55.919 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:56:05 -0400 (0:00:00.139) 0:07:56.059 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] 
******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:56:05 -0400 (0:00:00.140) 0:07:56.200 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:56:05 -0400 (0:00:00.188) 0:07:56.389 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-228a619c-db48-4cff-ab7e-39c8c44696a2 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:56:05 -0400 (0:00:00.319) 0:07:56.708 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:56:06 -0400 (0:00:00.233) 0:07:56.941 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:56:06 -0400 (0:00:00.200) 0:07:57.142 ********** ok: [managed-node16] => { 
"changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:56:06 -0400 (0:00:00.129) 0:07:57.272 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:56:06 -0400 (0:00:00.091) 0:07:57.363 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:56:06 -0400 (0:00:00.070) 0:07:57.434 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:56:06 -0400 (0:00:00.029) 0:07:57.463 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:56:06 -0400 
(0:00:00.026) 0:07:57.490 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:56:06 -0400 (0:00:00.024) 0:07:57.514 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:56:06 -0400 (0:00:00.032) 0:07:57.547 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:56:06 -0400 (0:00:00.110) 0:07:57.657 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:56:06 -0400 (0:00:00.072) 0:07:57.729 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task 
path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:56:06 -0400 (0:00:00.088) 0:07:57.818 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:56:06 -0400 (0:00:00.078) 0:07:57.897 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:56:07 -0400 (0:00:00.040) 0:07:57.937 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:56:07 -0400 (0:00:00.112) 0:07:58.050 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:56:07 -0400 (0:00:00.135) 0:07:58.186 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:56:07 -0400 (0:00:00.083) 0:07:58.269 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:56:07 -0400 (0:00:00.035) 0:07:58.305 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:56:07 -0400 (0:00:00.068) 0:07:58.373 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:56:07 -0400 (0:00:00.082) 0:07:58.456 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 
15:56:07 -0400 (0:00:00.050) 0:07:58.506 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:56:07 -0400 (0:00:00.035) 0:07:58.542 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:56:07 -0400 (0:00:00.026) 0:07:58.569 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:56:07 -0400 (0:00:00.039) 0:07:58.608 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:56:07 -0400 (0:00:00.063) 0:07:58.671 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:56:07 -0400 (0:00:00.047) 0:07:58.719 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:56:07 -0400 (0:00:00.112) 0:07:58.831 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:56:08 -0400 (0:00:00.139) 0:07:58.971 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:56:08 -0400 (0:00:00.112) 0:07:59.083 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:56:08 -0400 (0:00:00.137) 0:07:59.221 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | 
bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:56:08 -0400 (0:00:00.090) 0:07:59.311 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:56:08 -0400 (0:00:00.099) 0:07:59.411 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:56:08 -0400 (0:00:00.136) 0:07:59.547 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:56:08 -0400 (0:00:00.142) 0:07:59.690 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:56:08 -0400 (0:00:00.063) 0:07:59.753 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result 
was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:56:09 -0400 (0:00:00.222) 0:07:59.976 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:56:09 -0400 (0:00:00.256) 0:08:00.233 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:56:09 -0400 (0:00:00.111) 0:08:00.344 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:56:09 -0400 (0:00:00.068) 0:08:00.412 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:56:09 -0400 (0:00:00.103) 0:08:00.515 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:56:09 -0400 (0:00:00.100) 0:08:00.616 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:56:09 -0400 (0:00:00.083) 0:08:00.699 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:56:09 -0400 (0:00:00.085) 0:08:00.784 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:56:09 -0400 (0:00:00.078) 0:08:00.863 ********** 
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:56:10 -0400 (0:00:00.079) 0:08:00.942 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:56:10 -0400 (0:00:00.086) 0:08:01.029 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:56:10 -0400 (0:00:00.116) 0:08:01.146 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:56:10 -0400 (0:00:00.075) 0:08:01.221 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", 
"skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:56:10 -0400 (0:00:00.145) 0:08:01.367 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:56:10 -0400 (0:00:00.159) 0:08:01.526 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:56:10 -0400 (0:00:00.109) 0:08:01.636 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 15:56:10 -0400 (0:00:00.134) 0:08:01.770 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:242 Monday 20 April 2026 15:56:11 -0400 (0:00:00.990) 0:08:02.760 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:56:12 -0400 (0:00:00.415) 0:08:03.175 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:56:12 -0400 (0:00:00.410) 0:08:03.586 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:56:12 -0400 (0:00:00.076) 0:08:03.662 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:56:12 -0400 (0:00:00.013) 0:08:03.675 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:56:12 -0400 (0:00:00.063) 0:08:03.739 ********** TASK 
[fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:56:12 -0400 (0:00:00.126) 0:08:03.865 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:56:13 -0400 (0:00:00.086) 0:08:03.952 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:56:14 -0400 (0:00:01.800) 0:08:05.753 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:56:15 -0400 (0:00:01.103) 0:08:06.857 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if 
ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:56:16 -0400 (0:00:00.491) 0:08:07.348 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:56:16 -0400 (0:00:00.243) 0:08:07.592 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:56:16 -0400 (0:00:00.237) 0:08:07.829 ********** ok: 
[managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:56:17 -0400 (0:00:00.147) 0:08:07.976 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:56:17 -0400 (0:00:00.205) 0:08:08.182 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:56:17 -0400 (0:00:00.248) 0:08:08.431 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:56:17 -0400 (0:00:00.195) 0:08:08.627 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:56:17 -0400 (0:00:00.210) 0:08:08.837 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:56:21 -0400 (0:00:03.424) 0:08:12.261 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:56:21 -0400 (0:00:00.193) 0:08:12.455 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:56:21 -0400 (0:00:00.270) 0:08:12.726 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:56:27 -0400 (0:00:05.409) 0:08:18.135 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for 
managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:56:27 -0400 (0:00:00.278) 0:08:18.414 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:56:27 -0400 (0:00:00.090) 0:08:18.504 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:56:27 -0400 (0:00:00.221) 0:08:18.726 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:56:27 -0400 (0:00:00.102) 0:08:18.828 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:56:31 -0400 (0:00:03.651) 0:08:22.480 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": 
"fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service": { "name": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-cryptsetup@luks…2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service": { "name": "systemd-cryptsetup@luks…2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": 
"running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": 
"systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:56:34 -0400 (0:00:02.795) 0:08:25.276 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2df7031584\x2d2dc9\x2d4b44\x2d87e9\x2d1952aed231a9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "name": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", 
"Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f7031584-2dc9-4b44-87e9-1952aed231a9", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach 
luks-f7031584-2dc9-4b44-87e9-1952aed231a9 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f7031584-2dc9-4b44-87e9-1952aed231a9 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": 
"0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", 
"StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:55:28 EDT", "StateChangeTimestampMonotonic": "2155561038", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2d2dc9\x2d4b44\x2d87e9\x2d1952aed231a9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "name": "systemd-cryptsetup@luks…2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not 
set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", 
"PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:56:37 -0400 (0:00:03.252) 0:08:28.529 ********** fatal: [managed-node16]: 
FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-228a619c-db48-4cff-ab7e-39c8c44696a2' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:56:42 -0400 (0:00:05.032) 0:08:33.561 ********** fatal: [managed-node16]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-228a619c-db48-4cff-ab7e-39c8c44696a2' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 
'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:56:42 -0400 (0:00:00.131) 0:08:33.693 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2df7031584\x2d2dc9\x2d4b44\x2d87e9\x2d1952aed231a9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "name": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": 
"0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df7031584\\x2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", 
"StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2d2dc9\x2d4b44\x2d87e9\x2d1952aed231a9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "name": "systemd-cryptsetup@luks…2d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", 
"CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", 
"KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d2dc9\\x2d4b44\\x2d87e9\\x2d1952aed231a9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", 
"PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:56:45 -0400 (0:00:02.590) 0:08:36.283 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error 
message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:56:45 -0400 (0:00:00.100) 0:08:36.384 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:56:45 -0400 (0:00:00.361) 0:08:36.745 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:56:45 -0400 (0:00:00.168) 0:08:36.914 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714971.644667, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714971.644667, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776714971.644667, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1079304631", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:56:47 -0400 (0:00:01.033) 0:08:37.948 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:267 Monday 20 April 2026 15:56:47 -0400 (0:00:00.252) 0:08:38.200 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:56:47 -0400 (0:00:00.427) 0:08:38.627 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:56:47 -0400 (0:00:00.001) 0:08:38.629 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:56:47 -0400 (0:00:00.132) 0:08:38.762 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:56:48 -0400 (0:00:00.210) 0:08:38.973 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:56:48 -0400 (0:00:00.114) 0:08:39.087 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:56:50 -0400 (0:00:02.286) 0:08:41.374 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:56:51 -0400 (0:00:01.171) 0:08:42.545 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: 
[managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:56:51 -0400 (0:00:00.311) 0:08:42.857 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:56:52 -0400 (0:00:00.097) 0:08:42.955 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:56:52 -0400 (0:00:00.105) 0:08:43.061 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:56:52 -0400 (0:00:00.185) 0:08:43.246 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:56:52 -0400 (0:00:00.120) 0:08:43.367 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:56:52 -0400 (0:00:00.397) 0:08:43.764 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:56:52 -0400 (0:00:00.154) 0:08:43.918 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:56:53 -0400 (0:00:00.143) 0:08:44.062 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": 
[] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:56:56 -0400 (0:00:03.412) 0:08:47.474 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:56:56 -0400 (0:00:00.171) 0:08:47.646 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:56:56 -0400 (0:00:00.256) 0:08:47.903 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:57:02 -0400 (0:00:05.383) 0:08:53.286 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 
15:57:02 -0400 (0:00:00.222) 0:08:53.509 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:57:02 -0400 (0:00:00.236) 0:08:53.745 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:57:03 -0400 (0:00:00.202) 0:08:53.947 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:57:03 -0400 (0:00:00.121) 0:08:54.069 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:57:06 -0400 (0:00:03.387) 0:08:57.456 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { 
"name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service": { "name": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "source": "systemd", "state": "stopped", "status": "inactive" }, 
"systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service": { "name": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { 
"name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": 
"systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": 
"vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:57:09 -0400 (0:00:02.778) 0:09:00.235 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d228a619c\x2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket dev-sda1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-228a619c-db48-4cff-ab7e-39c8c44696a2 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-228a619c-db48-4cff-ab7e-39c8c44696a2 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", 
"LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", 
"StateChangeTimestamp": "Mon 2026-04-20 15:56:37 EDT", "StateChangeTimestampMonotonic": "2224477935", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner 
cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": 
"infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", 
"ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:57:12 -0400 (0:00:03.006) 0:09:03.241 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": "xfs" }, { "action": "destroy device", "device": 
"/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:57:18 -0400 (0:00:05.746) 0:09:08.988 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:57:18 -0400 (0:00:00.214) 0:09:09.202 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714926.5648613, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3e6554343770d53dc81a99ea0bac73007df7ed3d", "ctime": 1776714926.5618613, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714926.5618613, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": 
true, "size": 1425, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:57:19 -0400 (0:00:01.406) 0:09:10.609 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:57:20 -0400 (0:00:01.275) 0:09:11.884 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d228a619c\x2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner 
cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": 
"no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:56:37 EDT", "StateChangeTimestampMonotonic": "2224477935", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": 
"systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", 
"ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": 
"yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:57:24 -0400 (0:00:03.154) 0:09:15.039 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:57:24 -0400 (0:00:00.157) 0:09:15.196 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:57:24 -0400 (0:00:00.146) 0:09:15.343 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false 
} TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:57:24 -0400 (0:00:00.194) 0:09:15.537 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-228a619c-db48-4cff-ab7e-39c8c44696a2" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:57:26 -0400 (0:00:01.435) 0:09:16.973 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:57:27 -0400 (0:00:01.429) 0:09:18.403 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => 
{ "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:57:28 -0400 (0:00:01.191) 0:09:19.595 ********** skipping: [managed-node16] => (item={'src': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:57:28 -0400 (0:00:00.263) 0:09:19.858 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:57:30 -0400 (0:00:01.502) 0:09:21.361 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776714939.6438048, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "471690417c1a31adb6bcf8b2de19dd980e393b7b", "ctime": 1776714932.404836, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 404750513, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776714932.403836, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "3779483739", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:57:31 -0400 (0:00:01.226) 0:09:22.587 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda1', 'name': 'luks-228a619c-db48-4cff-ab7e-39c8c44696a2', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:57:33 -0400 (0:00:01.421) 0:09:24.008 
********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:57:35 -0400 (0:00:01.991) 0:09:26.000 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:284 Monday 20 April 2026 15:57:36 -0400 (0:00:01.322) 0:09:27.322 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:57:37 -0400 (0:00:00.715) 0:09:28.038 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:57:37 -0400 (0:00:00.349) 0:09:28.388 ********** skipping: [managed-node16] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:57:37 -0400 (0:00:00.138) 0:09:28.526 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:57:38 -0400 (0:00:01.049) 0:09:29.576 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002769", "end": "2026-04-20 15:57:39.550243", "rc": 0, "start": "2026-04-20 15:57:39.547474" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:57:39 -0400 
(0:00:01.098) 0:09:30.674 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002636", "end": "2026-04-20 15:57:40.923281", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:57:40.920645" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:57:41 -0400 (0:00:01.279) 0:09:31.954 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 
'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 15:57:41 -0400 (0:00:00.206) 0:09:32.160 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 15:57:41 -0400 (0:00:00.088) 0:09:32.248 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 15:57:41 -0400 (0:00:00.178) 0:09:32.427 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 15:57:41 -0400 (0:00:00.124) 0:09:32.551 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 => (item=members) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 15:57:41 -0400 (0:00:00.235) 0:09:32.787 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 15:57:41 -0400 (0:00:00.128) 0:09:32.915 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 15:57:42 -0400 (0:00:00.060) 0:09:32.976 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 15:57:42 -0400 (0:00:00.141) 0:09:33.117 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] 
********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 15:57:42 -0400 (0:00:00.174) 0:09:33.292 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 15:57:42 -0400 (0:00:00.154) 0:09:33.446 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 15:57:42 -0400 (0:00:00.112) 0:09:33.558 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 15:57:42 -0400 (0:00:00.080) 0:09:33.639 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 15:57:42 -0400 (0:00:00.163) 
0:09:33.803 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 15:57:42 -0400 (0:00:00.077) 0:09:33.881 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.41.179 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 15:57:44 -0400 (0:00:01.197) 0:09:35.079 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 15:57:44 -0400 (0:00:00.090) 0:09:35.169 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 15:57:44 -0400 (0:00:00.441) 0:09:35.611 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 15:57:44 -0400 (0:00:00.149) 
0:09:35.760 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 15:57:44 -0400 (0:00:00.140) 0:09:35.901 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 15:57:45 -0400 (0:00:00.162) 0:09:36.063 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 15:57:45 -0400 (0:00:00.145) 0:09:36.208 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 15:57:45 -0400 (0:00:00.160) 0:09:36.368 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 15:57:45 -0400 (0:00:00.193) 0:09:36.562 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 15:57:45 -0400 (0:00:00.211) 0:09:36.773 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 15:57:45 -0400 (0:00:00.149) 0:09:36.923 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 15:57:46 -0400 (0:00:00.181) 0:09:37.104 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 15:57:47 -0400 (0:00:00.896) 0:09:38.001 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": 
null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 15:57:47 -0400 (0:00:00.196) 0:09:38.198 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 15:57:47 -0400 (0:00:00.401) 0:09:38.600 ********** skipping: [managed-node16] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { 
"ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 15:57:47 -0400 (0:00:00.118) 0:09:38.718 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 15:57:48 -0400 (0:00:00.295) 0:09:39.014 ********** skipping: [managed-node16] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 15:57:48 -0400 (0:00:00.214) 0:09:39.229 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 15:57:48 -0400 (0:00:00.517) 0:09:39.746 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 15:57:49 -0400 (0:00:00.208) 0:09:39.954 ********** skipping: [managed-node16] => { 
"changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 15:57:49 -0400 (0:00:00.157) 0:09:40.112 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 15:57:49 -0400 (0:00:00.200) 0:09:40.312 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 15:57:49 -0400 (0:00:00.187) 0:09:40.500 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 15:57:49 -0400 (0:00:00.388) 0:09:40.888 ********** skipping: [managed-node16] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 
'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, 
"thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 15:57:50 -0400 (0:00:00.199) 0:09:41.088 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 15:57:50 -0400 (0:00:00.483) 0:09:41.572 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 15:57:51 -0400 (0:00:00.360) 0:09:41.932 ********** skipping: [managed-node16] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 15:57:51 -0400 (0:00:00.187) 0:09:42.120 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 15:57:51 -0400 (0:00:00.137) 0:09:42.257 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 15:57:51 -0400 (0:00:00.211) 0:09:42.469 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 15:57:51 -0400 (0:00:00.270) 0:09:42.739 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 15:57:51 -0400 (0:00:00.139) 0:09:42.878 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 15:57:52 -0400 (0:00:00.133) 0:09:43.012 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, 
"_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 15:57:52 -0400 (0:00:00.141) 0:09:43.153 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:57:52 -0400 (0:00:00.214) 0:09:43.368 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, 
"_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:57:52 -0400 (0:00:00.277) 0:09:43.646 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:57:53 -0400 (0:00:01.269) 0:09:44.916 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, 
"changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:57:54 -0400 (0:00:00.138) 0:09:45.054 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:57:54 -0400 (0:00:00.147) 0:09:45.201 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:57:54 -0400 (0:00:00.273) 0:09:45.474 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:57:54 -0400 (0:00:00.211) 0:09:45.686 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] 
******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:57:54 -0400 (0:00:00.167) 0:09:45.853 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:57:55 -0400 (0:00:00.160) 0:09:46.014 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:57:55 -0400 (0:00:00.201) 0:09:46.216 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:57:55 -0400 (0:00:00.094) 0:09:46.310 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:57:55 -0400 (0:00:00.080) 0:09:46.391 
********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:57:55 -0400 (0:00:00.121) 0:09:46.512 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:57:55 -0400 (0:00:00.073) 0:09:46.585 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:57:55 -0400 (0:00:00.290) 0:09:46.876 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 
April 2026 15:57:56 -0400 (0:00:00.243) 0:09:47.120 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:57:56 -0400 (0:00:00.122) 0:09:47.242 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:57:56 -0400 (0:00:00.197) 0:09:47.440 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:57:56 -0400 (0:00:00.197) 0:09:47.638 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:57:56 -0400 (0:00:00.106) 0:09:47.744 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* 
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:57:57 -0400 (0:00:00.185) 0:09:47.930 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:57:57 -0400 (0:00:00.242) 0:09:48.173 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715037.8103817, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715037.8103817, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 241581, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776715037.8103817, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:57:58 -0400 (0:00:01.012) 0:09:49.186 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:57:58 -0400 (0:00:00.296) 0:09:49.483 ********** skipping: [managed-node16] 
=> { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:57:58 -0400 (0:00:00.119) 0:09:49.602 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:57:58 -0400 (0:00:00.144) 0:09:49.747 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:57:58 -0400 (0:00:00.056) 0:09:49.803 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:57:59 -0400 (0:00:00.179) 0:09:49.982 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:57:59 -0400 (0:00:00.107) 0:09:50.090 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:57:59 -0400 (0:00:00.155) 0:09:50.246 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:58:02 -0400 (0:00:03.658) 0:09:53.905 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:58:03 -0400 (0:00:00.158) 0:09:54.063 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:58:03 -0400 (0:00:00.121) 0:09:54.184 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:58:03 -0400 
(0:00:00.215) 0:09:54.400 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:58:03 -0400 (0:00:00.147) 0:09:54.548 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:58:03 -0400 (0:00:00.121) 0:09:54.670 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:58:03 -0400 (0:00:00.131) 0:09:54.801 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:58:03 -0400 (0:00:00.122) 0:09:54.923 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set 
test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:58:04 -0400 (0:00:00.202) 0:09:55.126 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:58:04 -0400 (0:00:00.271) 0:09:55.398 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:58:04 -0400 (0:00:00.226) 0:09:55.624 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:58:04 -0400 (0:00:00.151) 0:09:55.776 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:58:04 -0400 (0:00:00.150) 
0:09:55.926 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:58:05 -0400 (0:00:00.146) 0:09:56.073 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:58:05 -0400 (0:00:00.153) 0:09:56.226 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:58:05 -0400 (0:00:00.130) 0:09:56.357 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:58:05 -0400 (0:00:00.183) 0:09:56.540 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] 
**************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:58:05 -0400 (0:00:00.195) 0:09:56.736 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:58:05 -0400 (0:00:00.177) 0:09:56.914 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:58:06 -0400 (0:00:00.229) 0:09:57.144 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:58:06 -0400 (0:00:00.088) 0:09:57.232 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:58:06 -0400 (0:00:00.258) 0:09:57.490 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:58:06 -0400 (0:00:00.189) 0:09:57.680 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:58:06 -0400 (0:00:00.238) 0:09:57.919 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:58:07 -0400 (0:00:00.203) 0:09:58.123 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:58:07 -0400 (0:00:00.193) 0:09:58.316 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:58:07 -0400 (0:00:00.165) 0:09:58.482 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:58:07 -0400 (0:00:00.152) 0:09:58.635 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:58:07 -0400 (0:00:00.179) 0:09:58.814 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:58:08 -0400 (0:00:00.147) 0:09:58.961 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 15:58:08 -0400 (0:00:00.149) 0:09:59.111 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] 
***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:58:08 -0400 (0:00:00.116) 0:09:59.228 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:58:08 -0400 (0:00:00.055) 0:09:59.284 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:58:08 -0400 (0:00:00.127) 0:09:59.411 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:58:08 -0400 (0:00:00.142) 0:09:59.554 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:58:08 -0400 (0:00:00.133) 0:09:59.688 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:58:08 -0400 (0:00:00.183) 0:09:59.871 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:58:09 -0400 (0:00:00.218) 0:10:00.090 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:58:09 -0400 (0:00:00.347) 0:10:00.437 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:58:09 -0400 (0:00:00.221) 0:10:00.659 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:58:09 -0400 (0:00:00.145) 0:10:00.805 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:58:10 -0400 (0:00:00.143) 0:10:00.948 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:58:10 -0400 (0:00:00.185) 0:10:01.133 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:58:10 -0400 (0:00:00.142) 0:10:01.276 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:58:10 -0400 (0:00:00.259) 0:10:01.535 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:58:10 -0400 (0:00:00.169) 0:10:01.704 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:58:10 -0400 (0:00:00.127) 0:10:01.831 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:58:11 -0400 (0:00:00.119) 0:10:01.951 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:58:11 -0400 (0:00:00.083) 0:10:02.035 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:58:11 -0400 (0:00:00.100) 0:10:02.135 ********** ok: [managed-node16] => { 
"storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:58:11 -0400 (0:00:00.116) 0:10:02.251 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:58:11 -0400 (0:00:00.133) 0:10:02.385 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:58:11 -0400 (0:00:00.086) 0:10:02.472 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:58:11 -0400 (0:00:00.112) 0:10:02.584 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] 
******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:58:11 -0400 (0:00:00.096) 0:10:02.680 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:58:11 -0400 (0:00:00.142) 0:10:02.823 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:58:11 -0400 (0:00:00.084) 0:10:02.907 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:58:12 -0400 (0:00:00.105) 0:10:03.013 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 15:58:12 -0400 (0:00:00.084) 0:10:03.098 **********
ok: [managed-node16] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026 15:58:12 -0400 (0:00:00.131) 0:10:03.229 **********
skipping: [managed-node16] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 15:58:12 -0400 (0:00:00.105) 0:10:03.335 **********
ok: [managed-node16] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Create a file] ***********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Monday 20 April 2026 15:58:12 -0400 (0:00:00.088) 0:10:03.423 **********
changed: [managed-node16] => {
    "changed": true,
    "dest": "/opt/test1/quux",
    "gid": 0,
    "group": "root",
    "mode": "0644",
    "owner": "root",
    "secontext": "unconfined_u:object_r:unlabeled_t:s0",
    "size": 0,
    "state": "file",
    "uid": 0
}

TASK [Test for correct handling of safe_mode - 4] ******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290
Monday 20 April 2026 15:58:13 -0400 (0:00:01.268) 0:10:04.692 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 20 April 2026 15:58:14 -0400 (0:00:00.629) 0:10:05.321 **********
ok: [managed-node16] => {
    "ansible_facts": {
        "storage_pools_global": [],
        "storage_safe_mode_global": true,
        "storage_volumes_global": []
    },
    "changed": false
}

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 20 April 2026 15:58:14 -0400 (0:00:00.201) 0:10:05.522 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16

TASK [Clear facts] *************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Monday 20 April 2026 15:58:14 -0400 (0:00:00.212) 0:10:05.735 **********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 15:58:14 -0400 (0:00:00.001) 0:10:05.737 **********
skipping: [managed-node16] => {
    "changed": false,
    "false_condition": "__sr_failed_when is defined",
    "skip_reason": "Conditional result was False"
}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 15:58:14 -0400 (0:00:00.152) 0:10:05.889 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 15:58:15 -0400 (0:00:00.184) 0:10:06.073 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 15:58:15 -0400 (0:00:00.132) 0:10:06.206 **********
ok: [managed-node16]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 15:58:17 -0400 (0:00:02.332) 0:10:08.538 **********
ok: [managed-node16] => {
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13
Monday 20 April 2026 15:58:18 -0400 (0:00:01.065) 0:10:09.604 **********
skipping: [managed-node16] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node16] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node16] => (item=CentOS_8.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_8.yml"
}
ok: [managed-node16] => (item=CentOS_8.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_8.yml"
}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 15:58:19 -0400 (0:00:00.367) 0:10:09.971 **********
skipping: [managed-node16] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 15:58:19 -0400 (0:00:00.133) 0:10:10.105 **********
skipping: [managed-node16] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 15:58:19 -0400 (0:00:00.145) 0:10:10.250 **********
ok: [managed-node16] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 15:58:19 -0400 (0:00:00.071) 0:10:10.322 **********
ok: [managed-node16] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 15:58:19 -0400 (0:00:00.153) 0:10:10.475 **********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 15:58:19 -0400 (0:00:00.303) 0:10:10.778 **********
skipping: [managed-node16] => {
    "changed": false,
    "false_condition": "_blivet_custom_repo.key is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 15:58:19 -0400 (0:00:00.147) 0:10:10.926 **********
skipping: [managed-node16] => {
    "changed": false,
    "false_condition": "_blivet_custom_repo.key is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 15:58:20 -0400 (0:00:00.084) 0:10:11.011 **********
ok: [managed-node16] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 15:58:24 -0400 (0:00:04.859) 0:10:15.870 **********
ok: [managed-node16] => {
    "storage_pools | d([])": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "partition",
            "volumes": [
                {
                    "encryption": true,
                    "encryption_luks_version": "luks2",
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g",
                    "type": "partition"
                }
            ]
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 15:58:25 -0400 (0:00:00.233) 0:10:16.104 **********
ok: [managed-node16] => {
    "storage_volumes | d([])": []
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 15:58:25 -0400 (0:00:00.198) 0:10:16.302 **********
ok: [managed-node16] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 15:58:30 -0400 (0:00:05.159) 0:10:21.461 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path:
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 15:58:30 -0400 (0:00:00.239) 0:10:21.700 **********
skipping: [managed-node16] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 15:58:30 -0400 (0:00:00.160) 0:10:21.861 **********
skipping: [managed-node16] => {
    "changed": false,
    "false_condition": "install_copr | d(false) | bool",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 15:58:31 -0400 (0:00:00.141) 0:10:22.002 **********
skipping: [managed-node16] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 15:58:31 -0400 (0:00:00.097) 0:10:22.099 **********
ok: [managed-node16] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 15:58:34 -0400 (0:00:03.507) 0:10:25.607 **********
ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name":
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { 
"name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { 
"name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": 
"iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": 
"nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": 
"plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": 
"sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service": { "name": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service": { "name": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": 
"running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": 
"systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:58:37 -0400 (0:00:02.938) 0:10:28.546 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d228a619c\x2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", 
"Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-228a619c-db48-4cff-ab7e-39c8c44696a2", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach 
luks-228a619c-db48-4cff-ab7e-39c8c44696a2 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-228a619c-db48-4cff-ab7e-39c8c44696a2 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", 
"StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:56:37 EDT", "StateChangeTimestampMonotonic": "2224477935", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not 
set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", 
"JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", 
"PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:58:40 -0400 (0:00:03.132) 
0:10:31.679 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:58:45 -0400 (0:00:04.579) 0:10:36.258 ********** fatal: [managed-node16]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 
'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:58:45 -0400 (0:00:00.049) 0:10:36.308 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d228a619c\x2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": 
"0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d228a619c\\x2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", 
"StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2ddb48\x2d4cff\x2dab7e\x2d39c8c44696a2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "name": "systemd-cryptsetup@luks…2ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", 
"CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", 
"KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62ddb48\\x2d4cff\\x2dab7e\\x2d39c8c44696a2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", 
"PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:58:48 -0400 (0:00:03.136) 0:10:39.444 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error 
message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:58:48 -0400 (0:00:00.242) 0:10:39.687 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:58:49 -0400 (0:00:00.328) 0:10:40.015 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:58:49 -0400 (0:00:00.089) 0:10:40.104 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715093.6091406, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715093.6091406, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776715093.6091406, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2322018772", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:58:50 -0400 (0:00:01.142) 0:10:41.247 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:317 Monday 20 April 2026 15:58:50 -0400 (0:00:00.175) 0:10:41.422 ********** ok: [managed-node16] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testqvrzdgdzlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:324 Monday 20 April 2026 15:58:51 -0400 (0:00:01.215) 0:10:42.637 ********** ok: [managed-node16] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testqvrzdgdzlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1776715131.8804505-151443-80177370964579/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:331 Monday 20 April 2026 15:58:54 -0400 (0:00:02.554) 0:10:45.192 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 15:58:54 -0400 (0:00:00.138) 0:10:45.330 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:58:54 -0400 (0:00:00.001) 0:10:45.332 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:58:54 -0400 (0:00:00.159) 0:10:45.491 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:58:54 -0400 (0:00:00.223) 0:10:45.715 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:58:54 -0400 (0:00:00.109) 0:10:45.825 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:58:56 -0400 (0:00:01.983) 0:10:47.808 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific 
variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:58:57 -0400 (0:00:00.784) 0:10:48.593 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:58:57 
-0400 (0:00:00.160) 0:10:48.753 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:58:57 -0400 (0:00:00.083) 0:10:48.837 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:58:58 -0400 (0:00:00.124) 0:10:48.962 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:58:58 -0400 (0:00:00.226) 0:10:49.188 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:58:58 -0400 (0:00:00.102) 0:10:49.291 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] 
************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:58:58 -0400 (0:00:00.213) 0:10:49.505 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:58:58 -0400 (0:00:00.041) 0:10:49.547 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:58:58 -0400 (0:00:00.040) 0:10:49.587 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:59:01 -0400 (0:00:03.243) 0:10:52.831 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:59:02 -0400 (0:00:00.146) 
0:10:52.977 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:59:02 -0400 (0:00:00.116) 0:10:53.094 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:59:07 -0400 (0:00:05.158) 0:10:58.252 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:59:07 -0400 (0:00:00.195) 0:10:58.448 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:59:07 -0400 (0:00:00.169) 0:10:58.618 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:59:07 -0400 (0:00:00.119) 0:10:58.737 
********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:59:07 -0400 (0:00:00.083) 0:10:58.820 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:59:11 -0400 (0:00:03.469) 0:11:02.290 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": 
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", 
"source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": 
"lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, 
"sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:59:14 -0400 (0:00:02.744) 0:11:05.035 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:59:14 -0400 
(0:00:00.382) 0:11:05.418 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "password": "/tmp/storage_testqvrzdgdzlukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:59:28 -0400 (0:00:13.794) 0:11:19.212 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:59:28 -0400 (0:00:00.186) 0:11:19.399 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715048.4763358, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a2799eafafb4eb09a1966bb1be5384eec54bb353", "ctime": 1776715048.4723358, "dev": 51713, "device_type": 0, "executable": false, "exists": 
true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715048.4723358, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:59:29 -0400 (0:00:01.100) 0:11:20.500 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:59:30 -0400 (0:00:01.160) 0:11:21.660 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:59:31 -0400 (0:00:00.293) 0:11:21.953 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": "xfs" } ], "changed": true, "crypts": [ { 
"backing_device": "/dev/sda1", "name": "luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "password": "/tmp/storage_testqvrzdgdzlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": 
null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:59:31 -0400 (0:00:00.151) 0:11:22.105 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:59:31 -0400 (0:00:00.118) 0:11:22.223 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:59:31 -0400 (0:00:00.164) 0:11:22.387 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': 'UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ab4d1e50-c8a3-4cd9-bf4e-a66d8f6381ac" } TASK [fedora.linux_system_roles.storage : Tell systemd 
to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:59:33 -0400 (0:00:01.581) 0:11:23.969 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:59:34 -0400 (0:00:01.471) 0:11:25.440 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:59:35 -0400 (0:00:01.267) 0:11:26.708 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { 
"ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:59:36 -0400 (0:00:00.247) 0:11:26.955 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:59:37 -0400 (0:00:01.532) 0:11:28.488 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715060.9222822, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715052.9033167, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 54526149, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776715052.9023166, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1716559422", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": 
false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:59:38 -0400 (0:00:01.154) 0:11:29.642 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda1', 'name': 'luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', 'password': '/tmp/storage_testqvrzdgdzlukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "password": "/tmp/storage_testqvrzdgdzlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:59:39 -0400 (0:00:01.199) 0:11:30.842 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:59:41 -0400 (0:00:02.055) 0:11:32.897 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:348 Monday 20 April 2026 15:59:43 -0400 (0:00:01.195) 0:11:34.093 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:59:43 -0400 
(0:00:00.159) 0:11:34.252 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:59:43 -0400 (0:00:00.128) 0:11:34.381 ********** skipping: [managed-node16] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:59:43 -0400 (0:00:00.085) 0:11:34.467 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "size": "4G", "type": "crypt", "uuid": "d6537849-f66d-463e-aa88-01d2268da618" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", 
"type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:59:44 -0400 (0:00:00.913) 0:11:35.380 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003643", "end": "2026-04-20 15:59:46.219030", "rc": 0, "start": "2026-04-20 15:59:45.215387" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:59:46 -0400 (0:00:01.856) 0:11:37.237 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002643", "end": "2026-04-20 15:59:46.848046", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:59:46.845403" } STDOUT: luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae /dev/sda1 /tmp/storage_testqvrzdgdzlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:59:46 -0400 (0:00:00.657) 0:11:37.895 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testqvrzdgdzlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 15:59:47 -0400 (0:00:00.183) 0:11:38.078 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 15:59:47 -0400 (0:00:00.116) 0:11:38.195 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 15:59:47 -0400 (0:00:00.086) 0:11:38.281 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 15:59:47 -0400 (0:00:00.127) 0:11:38.409 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 => (item=members) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 => (item=volumes) TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 15:59:47 -0400 (0:00:00.267) 0:11:38.676 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 15:59:47 -0400 (0:00:00.131) 0:11:38.808 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 15:59:47 -0400 (0:00:00.043) 0:11:38.851 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 15:59:47 -0400 (0:00:00.075) 0:11:38.926 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 15:59:48 -0400 (0:00:00.042) 0:11:38.969 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } 
TASK [Set expected pv type] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 15:59:48 -0400 (0:00:00.028) 0:11:38.997 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 15:59:48 -0400 (0:00:00.045) 0:11:39.042 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 15:59:48 -0400 (0:00:00.029) 0:11:39.072 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 15:59:48 -0400 (0:00:00.034) 0:11:39.107 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 15:59:48 -0400 (0:00:00.031) 0:11:39.139 ********** ok: [managed-node16] => 
{ "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.41.179 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 15:59:48 -0400 (0:00:00.639) 0:11:39.779 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 15:59:48 -0400 (0:00:00.060) 0:11:39.839 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 15:59:49 -0400 (0:00:00.252) 0:11:40.092 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 15:59:49 -0400 (0:00:00.120) 0:11:40.212 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 15:59:49 -0400 
(0:00:00.150) 0:11:40.362 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 15:59:49 -0400 (0:00:00.087) 0:11:40.450 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 15:59:49 -0400 (0:00:00.054) 0:11:40.504 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 15:59:49 -0400 (0:00:00.056) 0:11:40.561 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 15:59:49 -0400 (0:00:00.072) 0:11:40.633 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 15:59:49 -0400 (0:00:00.082) 0:11:40.716 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 15:59:49 -0400 (0:00:00.130) 0:11:40.847 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 15:59:50 -0400 (0:00:00.128) 0:11:40.975 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 15:59:50 -0400 (0:00:00.103) 0:11:41.079 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 15:59:50 -0400 (0:00:00.091) 0:11:41.170 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 15:59:50 -0400 (0:00:00.247) 0:11:41.417 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testqvrzdgdzlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 15:59:50 -0400 (0:00:00.238) 0:11:41.655 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 15:59:51 -0400 (0:00:00.323) 0:11:41.979 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testqvrzdgdzlukskey', 
'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": 
"uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 15:59:51 -0400 (0:00:00.177) 0:11:42.156 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 15:59:51 -0400 (0:00:00.443) 0:11:42.600 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 15:59:51 -0400 (0:00:00.127) 0:11:42.728 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 15:59:51 -0400 (0:00:00.081) 0:11:42.809 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 15:59:51 -0400 (0:00:00.074) 0:11:42.883 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 15:59:52 -0400 (0:00:00.118) 0:11:43.002 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 15:59:52 -0400 (0:00:00.439) 0:11:43.441 ********** skipping: [managed-node16] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testqvrzdgdzlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 
'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testqvrzdgdzlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, 
"thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 15:59:52 -0400 (0:00:00.111) 0:11:43.553 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 15:59:53 -0400 (0:00:00.413) 0:11:43.967 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 15:59:53 -0400 (0:00:00.136) 0:11:44.103 ********** skipping: [managed-node16] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 15:59:53 -0400 (0:00:00.098) 0:11:44.202 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 
15:59:53 -0400 (0:00:00.074) 0:11:44.276 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 15:59:53 -0400 (0:00:00.084) 0:11:44.361 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 15:59:53 -0400 (0:00:00.139) 0:11:44.500 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 15:59:53 -0400 (0:00:00.117) 0:11:44.618 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 15:59:53 -0400 (0:00:00.121) 0:11:44.740 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] 
****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 15:59:53 -0400 (0:00:00.091) 0:11:44.831 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testqvrzdgdzlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:59:54 -0400 (0:00:00.161) 0:11:44.992 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", 
"device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:59:54 -0400 (0:00:00.234) 0:11:45.227 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:59:56 -0400 (0:00:02.082) 0:11:47.309 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae" }, "changed": 
false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:59:56 -0400 (0:00:00.169) 0:11:47.478 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:59:56 -0400 (0:00:00.262) 0:11:47.741 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:59:57 -0400 (0:00:00.275) 0:11:48.016 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:59:57 -0400 (0:00:00.271) 0:11:48.288 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] 
******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:59:57 -0400 (0:00:00.223) 0:11:48.511 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:59:57 -0400 (0:00:00.299) 0:11:48.811 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:59:58 -0400 (0:00:00.168) 0:11:48.979 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:59:58 -0400 (0:00:00.152) 0:11:49.132 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:59:58 -0400 (0:00:00.141) 0:11:49.273 
********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:59:58 -0400 (0:00:00.184) 0:11:49.458 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:59:58 -0400 (0:00:00.090) 0:11:49.549 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:59:58 -0400 (0:00:00.348) 0:11:49.898 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 
Monday 20 April 2026 15:59:59 -0400 (0:00:00.240) 0:11:50.138 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:59:59 -0400 (0:00:00.224) 0:11:50.362 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:59:59 -0400 (0:00:00.164) 0:11:50.527 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:59:59 -0400 (0:00:00.253) 0:11:50.780 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:59:59 -0400 (0:00:00.145) 0:11:50.926 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:00:00 -0400 (0:00:00.255) 0:11:51.181 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:00:00 -0400 (0:00:00.363) 0:11:51.545 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715167.956819, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715167.956819, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260284, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776715167.956819, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:00:01 -0400 (0:00:01.157) 0:11:52.702 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:00:02 -0400 
(0:00:00.226) 0:11:52.928 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:00:02 -0400 (0:00:00.127) 0:11:53.055 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:00:02 -0400 (0:00:00.256) 0:11:53.312 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:00:02 -0400 (0:00:00.169) 0:11:53.481 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:00:02 -0400 (0:00:00.128) 0:11:53.609 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:00:02 -0400 
(0:00:00.175) 0:11:53.784 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715168.0928185, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715168.0928185, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 260403, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715168.0928185, "nlink": 1, "path": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:00:03 -0400 (0:00:01.012) 0:11:54.797 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:00:07 -0400 (0:00:03.387) 0:11:58.184 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.009880", "end": "2026-04-20 16:00:08.436588", "rc": 0, "start": "2026-04-20 16:00:08.426708" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 
[bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 937509 Threads: 2 Salt: 9a 7d b1 25 cd 57 da 34 cf 88 36 5d 4e 7a f5 35 c1 8d 9e cd 95 48 10 f0 66 85 a1 1f 87 96 7b b1 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: a2 c4 6a e8 34 f4 cb 63 91 32 03 45 41 cc a9 a2 73 36 00 c9 fa 94 39 8d 86 83 2b ad 31 56 eb d6 Digest: ae d6 43 96 bd 79 a3 ec c4 8b b3 a7 dd f7 f3 f5 c1 0f 93 c5 ad 4f 9f f4 37 3f 0e 47 eb e5 bb a9 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:00:08 -0400 (0:00:01.350) 0:11:59.535 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:00:08 -0400 (0:00:00.283) 0:11:59.818 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:00:09 -0400 (0:00:00.209) 0:12:00.028 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:00:09 
-0400 (0:00:00.164) 0:12:00.193 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:00:09 -0400 (0:00:00.280) 0:12:00.473 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:00:09 -0400 (0:00:00.258) 0:12:00.731 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:00:10 -0400 (0:00:00.234) 0:12:00.966 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:00:10 -0400 (0:00:00.217) 0:12:01.184 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae /dev/sda1 /tmp/storage_testqvrzdgdzlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testqvrzdgdzlukskey" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:00:10 -0400 (0:00:00.327) 0:12:01.511 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:00:10 -0400 (0:00:00.250) 0:12:01.762 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:00:11 -0400 (0:00:00.204) 0:12:01.966 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:00:11 -0400 (0:00:00.351) 0:12:02.317 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:00:11 -0400 (0:00:00.273) 0:12:02.591 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:00:11 -0400 (0:00:00.150) 0:12:02.741 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:00:11 -0400 (0:00:00.166) 0:12:02.907 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:00:12 -0400 (0:00:00.194) 0:12:03.102 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:00:12 -0400 (0:00:00.160) 0:12:03.262 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:00:12 -0400 (0:00:00.159) 0:12:03.421 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 
'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:00:12 -0400 (0:00:00.211) 0:12:03.632 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:00:12 -0400 (0:00:00.215) 0:12:03.847 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:00:13 -0400 (0:00:00.100) 0:12:03.948 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:00:13 -0400 (0:00:00.233) 0:12:04.181 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 
16:00:13 -0400 (0:00:00.097) 0:12:04.278 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:00:13 -0400 (0:00:00.200) 0:12:04.479 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:00:13 -0400 (0:00:00.177) 0:12:04.657 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:00:13 -0400 (0:00:00.168) 0:12:04.825 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:00:14 -0400 (0:00:00.165) 0:12:04.990 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 
'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:00:14 -0400 (0:00:00.122) 0:12:05.113 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:00:14 -0400 (0:00:00.179) 0:12:05.292 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:00:14 -0400 (0:00:00.205) 0:12:05.498 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:00:14 -0400 (0:00:00.142) 0:12:05.641 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:00:14 -0400 (0:00:00.175) 0:12:05.816 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional 
result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:00:15 -0400 (0:00:00.177) 0:12:05.994 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:00:15 -0400 (0:00:00.227) 0:12:06.222 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:00:15 -0400 (0:00:00.250) 0:12:06.472 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:00:15 -0400 (0:00:00.246) 0:12:06.719 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:00:15 -0400 (0:00:00.136) 
0:12:06.856 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:00:16 -0400 (0:00:00.197) 0:12:07.053 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:00:16 -0400 (0:00:00.183) 0:12:07.237 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:00:16 -0400 (0:00:00.150) 0:12:07.387 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:00:16 -0400 (0:00:00.154) 0:12:07.542 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:00:16 
-0400 (0:00:00.130) 0:12:07.673 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:00:16 -0400 (0:00:00.127) 0:12:07.800 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:00:16 -0400 (0:00:00.074) 0:12:07.875 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:00:17 -0400 (0:00:00.228) 0:12:08.103 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:00:17 -0400 (0:00:00.292) 0:12:08.396 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:00:17 -0400 (0:00:00.193) 0:12:08.590 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:00:17 -0400 (0:00:00.300) 0:12:08.890 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:00:18 -0400 (0:00:00.211) 0:12:09.102 ********** ok: [managed-node16] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined. 
'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:00:18 -0400 (0:00:00.255) 0:12:09.357 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:00:18 -0400 (0:00:00.241) 0:12:09.599 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:00:18 -0400 (0:00:00.161) 0:12:09.761 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:00:18 -0400 (0:00:00.145) 0:12:09.906 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:00:19 -0400 (0:00:00.151) 0:12:10.058 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:00:19 -0400 (0:00:00.161) 0:12:10.220 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:00:19 -0400 (0:00:00.170) 0:12:10.390 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:00:19 -0400 (0:00:00.145) 0:12:10.536 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:00:19 
-0400 (0:00:00.162) 0:12:10.699 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:00:19 -0400 (0:00:00.178) 0:12:10.878 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:00:20 -0400 (0:00:00.126) 0:12:11.005 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:351 Monday 20 April 2026 16:00:20 -0400 (0:00:00.107) 0:12:11.112 ********** ok: [managed-node16] => { "changed": false, "path": "/tmp/storage_testqvrzdgdzlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:361 Monday 20 April 2026 16:00:21 -0400 (0:00:01.307) 0:12:12.419 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 16:00:21 -0400 (0:00:00.205) 0:12:12.625 ********** ok: 
[managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 16:00:21 -0400 (0:00:00.209) 0:12:12.834 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 16:00:22 -0400 (0:00:00.148) 0:12:12.983 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:00:22 -0400 (0:00:00.001) 0:12:12.984 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:00:22 -0400 (0:00:00.289) 0:12:13.274 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:00:22 -0400 (0:00:00.215) 0:12:13.490 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK 
[fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:00:22 -0400 (0:00:00.374) 0:12:13.864 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:00:25 -0400 (0:00:02.386) 0:12:16.250 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:00:26 -0400 (0:00:01.165) 0:12:17.416 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", 
"libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:00:26 -0400 (0:00:00.442) 0:12:17.858 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:00:27 -0400 (0:00:00.272) 0:12:18.130 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:00:27 -0400 (0:00:00.191) 0:12:18.322 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:00:27 -0400 (0:00:00.164) 0:12:18.486 ********** ok: [managed-node16] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:00:27 -0400 (0:00:00.186) 0:12:18.672 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:00:28 -0400 (0:00:00.554) 0:12:19.227 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:00:28 -0400 (0:00:00.232) 0:12:19.460 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:00:28 -0400 (0:00:00.192) 0:12:19.652 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:00:32 -0400 (0:00:03.612) 0:12:23.264 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:00:32 -0400 (0:00:00.239) 0:12:23.504 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:00:32 -0400 (0:00:00.178) 0:12:23.682 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:00:38 -0400 (0:00:05.446) 0:12:29.129 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:00:38 -0400 (0:00:00.274) 0:12:29.403 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK 
[fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:00:38 -0400 (0:00:00.136) 0:12:29.539 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:00:38 -0400 (0:00:00.113) 0:12:29.653 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:00:38 -0400 (0:00:00.182) 0:12:29.835 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:00:42 -0400 (0:00:03.389) 0:12:33.224 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": 
"nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, 
"power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", 
"state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { 
"name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:00:44 -0400 (0:00:02.448) 0:12:35.673 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:00:45 -0400 (0:00:00.426) 0:12:36.099 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 16:00:50 -0400 (0:00:05.170) 0:12:41.316 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 
'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:00:50 -0400 (0:00:00.290) 0:12:41.607 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 16:00:50 -0400 (0:00:00.173) 0:12:41.781 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 16:00:51 -0400 (0:00:00.185) 0:12:41.966 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 16:00:51 -0400 (0:00:00.327) 0:12:42.294 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:380 Monday 20 April 2026 16:00:51 -0400 (0:00:00.250) 0:12:42.545 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 16:00:51 -0400 (0:00:00.110) 0:12:42.656 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:00:51 -0400 (0:00:00.001) 0:12:42.658 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:00:51 -0400 (0:00:00.057) 0:12:42.715 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:00:51 -0400 (0:00:00.122) 0:12:42.838 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:00:51 -0400 (0:00:00.083) 0:12:42.921 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:00:53 -0400 (0:00:01.922) 0:12:44.844 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:00:54 -0400 (0:00:00.874) 0:12:45.718 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 16:00:55 -0400 (0:00:00.325) 0:12:46.043 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 16:00:55 -0400 (0:00:00.123) 0:12:46.167 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 16:00:55 -0400 (0:00:00.092) 0:12:46.260 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 16:00:55 -0400 (0:00:00.067) 0:12:46.327 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 16:00:55 -0400 (0:00:00.098) 0:12:46.426 **********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 16:00:55 -0400 (0:00:00.216) 0:12:46.643 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 16:00:55 -0400 (0:00:00.047) 0:12:46.690 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 16:00:55 -0400 (0:00:00.089) 0:12:46.780 **********
ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 16:00:58 -0400 (0:00:02.897) 0:12:49.677 **********
ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 16:00:58 -0400 (0:00:00.146) 0:12:49.824 **********
ok: [managed-node16] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 16:00:58 -0400 (0:00:00.055) 0:12:49.879 **********
ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
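(Editor's note: the `storage_pools` value printed by the "Show storage_pools" task above corresponds, roughly, to a storage role invocation along these lines. This is a hedged sketch reconstructed from the logged values, not the actual test playbook; the play name and variable placement are assumptions.)

```yaml
# Sketch of a playbook that would yield the logged storage_pools value.
# All parameter values (disks, sizes, LUKS settings) are taken verbatim
# from the "Show storage_pools" output above.
- name: Create an encrypted LVM volume (sketch)
  hosts: managed-node16
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_cipher: aes-xts-plain64
                encryption_key_size: 512
                encryption_password: yabbadabbadoo
```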
Monday 20 April 2026 16:01:03 -0400 (0:00:04.597) 0:12:54.477 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 16:01:03 -0400 (0:00:00.122) 0:12:54.599 **********
skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 16:01:03 -0400 (0:00:00.088) 0:12:54.687 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 16:01:03 -0400 (0:00:00.086) 0:12:54.774 **********
skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 16:01:03 -0400 (0:00:00.038) 0:12:54.813 **********
ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 16:01:06 -0400
(0:00:02.933) 0:12:57.747 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": 
"plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": 
"vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false }

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82
Monday 20 April 2026 16:01:09 -0400 (0:00:02.414) 0:13:00.162 **********
skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
Monday 20 April 2026 16:01:09 -0400 (0:00:00.371) 0:13:00.534 **********
changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103
Monday 20 April 2026 16:01:24 -0400 (0:00:14.661) 0:13:15.196 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110
Monday 20 April 2026 16:01:24 -0400 (0:00:00.214) 0:13:15.410 **********
ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715175.609786, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8f1d10d493962423ea81ca38764d58a0de03fc9e", "ctime": 1776715175.606786, "dev":
51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715175.606786, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:01:25 -0400 (0:00:01.325) 0:13:16.736 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:01:27 -0400 (0:00:01.216) 0:13:17.953 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:01:27 -0400 (0:00:00.304) 0:13:18.257 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", 
"fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:01:27 -0400 (0:00:00.178) 0:13:18.436 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:01:28 -0400 (0:00:00.918) 0:13:19.354 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 
April 2026 16:01:28 -0400 (0:00:00.127) 0:13:19.482 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:01:30 -0400 (0:00:01.670) 0:13:21.153 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:01:32 -0400 (0:00:02.113) 0:13:23.267 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:01:33 -0400 (0:00:01.502) 0:13:24.769 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:01:34 -0400 (0:00:00.339) 0:13:25.109 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:01:36 -0400 (0:00:01.938) 0:13:27.047 ********** ok: [managed-node16] => { 
"changed": false, "stat": { "atime": 1776715186.8467374, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8619f45badc2463683d20a97e2ed0a87183b2010", "ctime": 1776715179.787768, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 216006873, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715179.786768, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "379768204", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:01:37 -0400 (0:00:01.265) 0:13:28.313 ********** changed: [managed-node16] => (item={'backing_device': '/dev/sda1', 'name': 'luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node16] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-06f177c4-ae08-46dd-9417-c35fb55a6429', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] 
************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:01:40 -0400 (0:00:02.855) 0:13:31.168 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:01:42 -0400 (0:00:02.057) 0:13:33.226 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:398 Monday 20 April 2026 16:01:43 -0400 (0:00:01.631) 0:13:34.858 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:01:44 -0400 (0:00:00.242) 0:13:35.100 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_raw_device": "/dev/mapper/foo-test1", 
"_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:01:44 -0400 (0:00:00.315) 0:13:35.415 ********** skipping: [managed-node16] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:01:44 -0400 (0:00:00.105) 0:13:35.521 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "06f177c4-ae08-46dd-9417-c35fb55a6429" }, "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "size": "4G", "type": "crypt", "uuid": "7035295a-8be9-4a6a-b390-1e8fd499c4f1" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": 
"250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:01:45 -0400 (0:00:00.988) 0:13:36.509 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003588", "end": "2026-04-20 16:01:47.638459", "rc": 0, "start": "2026-04-20 16:01:46.634871" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:01:47 -0400 (0:00:02.216) 0:13:38.726 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002496", "end": "2026-04-20 16:01:48.593299", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:01:48.590803" } STDOUT: luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:01:48 -0400 (0:00:00.985) 0:13:39.712 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for 
managed-node16 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 16:01:49 -0400 (0:00:00.297) 
0:13:40.009 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 16:01:49 -0400 (0:00:00.100) 0:13:40.109 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.026528", "end": "2026-04-20 16:01:50.083070", "rc": 0, "start": "2026-04-20 16:01:50.056542" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 16:01:50 -0400 (0:00:01.028) 0:13:41.138 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 16:01:50 -0400 (0:00:00.250) 0:13:41.388 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 => (item=members) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 16:01:50 -0400 (0:00:00.174) 0:13:41.563 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } 
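As an aside on the `/etc/crypttab` entry shown earlier in this run (`luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 -`): each crypttab line is whitespace-separated as mapped name, backing device, key file (`-` meaning none/prompt), and optional options. The helper below is an illustrative sketch for parsing such a line — `parse_crypttab_line` is a hypothetical name, not part of the storage role or its tests.

```python
# Illustrative helper (not part of the role under test): parse one
# /etc/crypttab line of the form written by the storage role above,
#   "luks-<uuid> /dev/mapper/foo-test1 -"
# Fields: mapped name, backing device, key file ("-" = none/prompt),
# and optional mount-style options.
def parse_crypttab_line(line: str) -> dict:
    fields = line.split()
    return {
        "name": fields[0],
        "backing_device": fields[1],
        "password": fields[2] if len(fields) > 2 else "-",
        "options": fields[3] if len(fields) > 3 else "",
    }

entry = parse_crypttab_line(
    "luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 -"
)
print(entry["backing_device"])  # → /dev/mapper/foo-test1
```

This mirrors the entry the "Manage /etc/crypttab" task added for the new LUKS2 volume, where the `-` password field matches the role's `"password": "-"` crypt record.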
TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 16:01:50 -0400 (0:00:00.199) 0:13:41.763 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 16:01:52 -0400 (0:00:02.069) 0:13:43.833 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 16:01:53 -0400 (0:00:00.255) 0:13:44.088 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 16:01:53 -0400 (0:00:00.195) 0:13:44.284 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 16:01:53 -0400 (0:00:00.198) 0:13:44.483 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 16:01:53 -0400 (0:00:00.129) 0:13:44.613 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 16:01:53 -0400 (0:00:00.197) 0:13:44.811 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 16:01:54 -0400 (0:00:00.195) 0:13:45.006 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 16:01:54 -0400 (0:00:00.134) 0:13:45.141 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.41.179 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 16:01:55 -0400 (0:00:01.150) 0:13:46.291 ********** skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "grow_supported.rc == 0", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 16:01:55 -0400 (0:00:00.211) 0:13:46.503 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 16:01:55 -0400 (0:00:00.404) 0:13:46.907 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 16:01:56 -0400 (0:00:00.180) 0:13:47.088 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 16:01:56 -0400 (0:00:00.251) 0:13:47.340 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 16:01:56 -0400 (0:00:00.141) 0:13:47.482 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 16:01:56 -0400 (0:00:00.144) 0:13:47.627 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 16:01:56 -0400 (0:00:00.277) 0:13:47.904 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 16:01:57 -0400 (0:00:00.166) 0:13:48.071 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": 
"Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 16:01:57 -0400 (0:00:00.209) 0:13:48.280 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 16:01:57 -0400 (0:00:00.172) 0:13:48.453 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 16:01:57 -0400 (0:00:00.170) 0:13:48.624 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 16:01:57 -0400 (0:00:00.143) 0:13:48.767 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 16:01:58 -0400 (0:00:00.165) 0:13:48.933 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 16:01:58 -0400 (0:00:00.323) 0:13:49.256 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about 
the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 16:01:58 -0400 (0:00:00.401) 0:13:49.658 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 16:01:58 -0400 (0:00:00.138) 0:13:49.796 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 16:01:59 -0400 (0:00:00.236) 0:13:50.033 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 16:01:59 -0400 (0:00:00.307) 0:13:50.340 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 
2026 16:01:59 -0400 (0:00:00.185) 0:13:50.525 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 16:01:59 -0400 (0:00:00.231) 0:13:50.757 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 16:02:00 -0400 (0:00:00.230) 0:13:50.987 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 16:02:00 -0400 (0:00:00.062) 0:13:51.049 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 16:02:00 -0400 (0:00:00.311) 0:13:51.361 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 => (item={'encryption': True, 
'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 16:02:00 -0400 (0:00:00.232) 0:13:51.593 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 16:02:00 -0400 (0:00:00.217) 0:13:51.811 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 16:02:01 -0400 (0:00:00.163) 0:13:51.974 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 16:02:01 -0400 (0:00:00.138) 0:13:52.113 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 16:02:01 -0400 (0:00:00.185) 0:13:52.299 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 16:02:01 -0400 (0:00:00.219) 0:13:52.518 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 16:02:01 -0400 (0:00:00.170) 0:13:52.688 
********** skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 16:02:01 -0400 (0:00:00.183) 0:13:52.872 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 16:02:02 -0400 (0:00:00.247) 0:13:53.120 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 16:02:02 -0400 (0:00:00.261) 0:13:53.381 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 16:02:02 -0400 (0:00:00.297) 0:13:53.679 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": 
"Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 16:02:02 -0400 (0:00:00.185) 0:13:53.864 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 16:02:03 -0400 (0:00:00.157) 0:13:54.022 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 16:02:03 -0400 (0:00:00.215) 0:13:54.237 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 16:02:03 -0400 (0:00:00.224) 0:13:54.462 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 
16:02:03 -0400 (0:00:00.083) 0:13:54.545 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 16:02:03 -0400 (0:00:00.380) 0:13:54.926 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 16:02:04 -0400 (0:00:00.212) 0:13:55.139 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 16:02:04 -0400 (0:00:00.239) 0:13:55.378 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 16:02:04 -0400 (0:00:00.123) 0:13:55.502 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 16:02:04 -0400 (0:00:00.151) 0:13:55.653 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 16:02:04 -0400 (0:00:00.145) 0:13:55.798 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 16:02:04 -0400 (0:00:00.088) 0:13:55.886 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 16:02:05 -0400 (0:00:00.121) 0:13:56.008 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 16:02:05 -0400 (0:00:00.136) 0:13:56.145 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 16:02:05 -0400 (0:00:00.438) 0:13:56.583 ********** skipping: [managed-node16] => { "changed": false, 
"false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 16:02:05 -0400 (0:00:00.221) 0:13:56.805 ********** skipping: [managed-node16] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 16:02:06 -0400 (0:00:00.231) 0:13:57.036 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 16:02:06 -0400 (0:00:00.161) 0:13:57.197 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 16:02:06 -0400 (0:00:00.130) 0:13:57.328 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 16:02:06 -0400 
(0:00:00.219) 0:13:57.547 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 16:02:06 -0400 (0:00:00.110) 0:13:57.658 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 16:02:06 -0400 (0:00:00.080) 0:13:57.738 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 16:02:06 -0400 (0:00:00.122) 0:13:57.861 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 
'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:02:07 -0400 (0:00:00.182) 0:13:58.043 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:02:07 -0400 (0:00:00.179) 0:13:58.223 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:02:08 -0400 (0:00:01.326) 0:13:59.549 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:02:08 -0400 (0:00:00.209) 0:13:59.759 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:02:09 -0400 (0:00:00.320) 0:14:00.080 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not 
storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:02:09 -0400 (0:00:00.402) 0:14:00.482 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:02:09 -0400 (0:00:00.249) 0:14:00.732 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:02:10 -0400 (0:00:00.237) 0:14:00.969 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:02:10 -0400 (0:00:00.192) 0:14:01.162 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was 
False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:02:10 -0400 (0:00:00.323) 0:14:01.486 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:02:10 -0400 (0:00:00.194) 0:14:01.680 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:02:10 -0400 (0:00:00.118) 0:14:01.799 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:02:11 -0400 (0:00:00.213) 0:14:02.013 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:02:11 -0400 (0:00:00.171) 0:14:02.185 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:02:11 -0400 (0:00:00.416) 0:14:02.602 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:02:11 -0400 (0:00:00.196) 0:14:02.799 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:02:12 -0400 (0:00:00.231) 0:14:03.030 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 
20 April 2026 16:02:12 -0400 (0:00:00.136) 0:14:03.167 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:02:12 -0400 (0:00:00.260) 0:14:03.428 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:02:12 -0400 (0:00:00.160) 0:14:03.588 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:02:12 -0400 (0:00:00.282) 0:14:03.870 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:02:13 -0400 (0:00:00.292) 0:14:04.163 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715283.857318, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715283.857318, "dev": 6, "device_type": 64768, "executable": 
false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 276233, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715283.857318, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:02:14 -0400 (0:00:01.164) 0:14:05.327 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:02:14 -0400 (0:00:00.096) 0:14:05.423 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:02:14 -0400 (0:00:00.163) 0:14:05.586 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:02:14 -0400 (0:00:00.199) 
0:14:05.786 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:02:14 -0400 (0:00:00.136) 0:14:05.923 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:02:15 -0400 (0:00:00.205) 0:14:06.128 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:02:15 -0400 (0:00:00.200) 0:14:06.328 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715283.9973173, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715283.9973173, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 276389, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715283.9973173, "nlink": 1, "path": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": 
false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:02:16 -0400 (0:00:01.286) 0:14:07.615 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:02:20 -0400 (0:00:03.440) 0:14:11.055 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010831", "end": "2026-04-20 16:02:21.388602", "rc": 0, "start": "2026-04-20 16:02:21.377771" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 06f177c4-ae08-46dd-9417-c35fb55a6429 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 930270 Threads: 2 Salt: d0 20 4c 39 55 df d9 d6 8c 54 1a 92 51 ed d2 39 96 f6 8e 0e 9f c6 04 2f cf 3b 9f 5f ed 08 8f a4 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 6f 9b 10 39 0c db 01 79 f6 72 05 25 f0 a5 6f 60 cb d4 60 c9 4f f5 85 46 14 29 a5 64 60 3a 5d 8b Digest: 2c 4e 25 83 a9 1d 85 f9 81 0b 79 0b 09 ec d4 88 61 92 d4 51 86 7a 41 dd 4a c6 da 26 6d 23 fc 63 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:02:21 -0400 (0:00:01.408) 0:14:12.464 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:02:21 -0400 (0:00:00.336) 0:14:12.801 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:02:22 -0400 (0:00:00.277) 0:14:13.078 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:02:22 -0400 (0:00:00.200) 0:14:13.279 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:02:22 -0400 (0:00:00.283) 0:14:13.563 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:02:22 -0400 (0:00:00.327) 0:14:13.891 ********** ok: [managed-node16] => { 
"changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:02:23 -0400 (0:00:00.474) 0:14:14.365 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:02:23 -0400 (0:00:00.514) 0:14:14.880 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:02:24 -0400 (0:00:00.276) 0:14:15.157 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:02:24 -0400 (0:00:00.332) 0:14:15.489 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:02:24 -0400 (0:00:00.282) 0:14:15.772 ********** ok: [managed-node16] => { "changed": false } MSG: 
All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:02:25 -0400 (0:00:00.355) 0:14:16.127 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:02:25 -0400 (0:00:00.305) 0:14:16.432 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:02:25 -0400 (0:00:00.088) 0:14:16.521 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:02:25 -0400 (0:00:00.216) 0:14:16.737 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:02:26 -0400 (0:00:00.191) 0:14:16.929 
********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:02:26 -0400 (0:00:00.164) 0:14:17.093 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:02:26 -0400 (0:00:00.167) 0:14:17.261 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:02:26 -0400 (0:00:00.193) 0:14:17.454 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:02:26 -0400 (0:00:00.214) 0:14:17.669 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:02:26 -0400 (0:00:00.185) 0:14:17.854 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:02:27 -0400 (0:00:00.225) 0:14:18.079 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:02:27 -0400 (0:00:00.154) 0:14:18.234 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:02:27 -0400 (0:00:00.207) 0:14:18.441 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:02:30 -0400 (0:00:02.834) 0:14:21.275 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base 
value for expected size] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:02:31 -0400 (0:00:00.935) 0:14:22.211 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:02:31 -0400 (0:00:00.103) 0:14:22.315 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:02:31 -0400 (0:00:00.080) 0:14:22.395 ********** ok: [managed-node16] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:02:32 -0400 (0:00:00.756) 0:14:23.152 ********** skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:02:32 -0400 (0:00:00.250) 0:14:23.402 ********** skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:02:32 -0400 (0:00:00.195) 0:14:23.598 ********** skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:02:33 -0400 (0:00:00.388) 0:14:23.986 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:02:33 -0400 (0:00:00.076) 0:14:24.063 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:02:33 -0400 (0:00:00.092) 0:14:24.156 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:02:33 -0400 (0:00:00.104) 0:14:24.261 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was 
False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:02:33 -0400 (0:00:00.038) 0:14:24.299 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:02:33 -0400 (0:00:00.060) 0:14:24.360 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:02:33 -0400 (0:00:00.168) 0:14:24.528 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:02:33 -0400 (0:00:00.052) 0:14:24.581 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:02:33 -0400 (0:00:00.076) 0:14:24.658 
********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:02:33 -0400 (0:00:00.047) 0:14:24.705 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:02:33 -0400 (0:00:00.073) 0:14:24.779 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:02:33 -0400 (0:00:00.070) 0:14:24.849 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:02:34 -0400 (0:00:00.123) 0:14:24.973 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:02:34 -0400 (0:00:00.123) 0:14:25.097 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:02:34 -0400 (0:00:00.085) 0:14:25.182 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:02:34 -0400 (0:00:00.192) 0:14:25.375 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:02:34 -0400 (0:00:00.230) 0:14:25.606 ********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:02:34 -0400 (0:00:00.173) 0:14:25.779 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 
16:02:35 -0400 (0:00:00.209) 0:14:25.989 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:02:35 -0400 (0:00:00.286) 0:14:26.275 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023787", "end": "2026-04-20 16:02:36.630014", "rc": 0, "start": "2026-04-20 16:02:36.606227" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:02:36 -0400 (0:00:01.474) 0:14:27.750 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:02:37 -0400 (0:00:00.181) 0:14:27.931 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:02:37 -0400 (0:00:00.152) 0:14:28.083 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" 
} TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:02:37 -0400 (0:00:00.126) 0:14:28.210 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:02:37 -0400 (0:00:00.146) 0:14:28.356 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:02:37 -0400 (0:00:00.224) 0:14:28.580 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:02:37 -0400 (0:00:00.220) 0:14:28.801 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:02:37 -0400 (0:00:00.100) 0:14:28.902 ********** skipping: [managed-node16] => { "changed": false, 
"skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:02:38 -0400 (0:00:00.114) 0:14:29.016 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:401 Monday 20 April 2026 16:02:38 -0400 (0:00:00.146) 0:14:29.163 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 16:02:38 -0400 (0:00:00.259) 0:14:29.423 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:02:38 -0400 (0:00:00.001) 0:14:29.425 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:02:38 -0400 (0:00:00.138) 0:14:29.563 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task 
path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:02:38 -0400 (0:00:00.234) 0:14:29.798 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:02:38 -0400 (0:00:00.109) 0:14:29.907 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:02:40 -0400 (0:00:01.829) 0:14:31.737 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:02:41 -0400 (0:00:01.148) 0:14:32.886 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:02:42 -0400 (0:00:00.439) 0:14:33.325 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:02:42 -0400 (0:00:00.215) 0:14:33.540 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:02:42 -0400 (0:00:00.063) 0:14:33.603 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:02:42 -0400 (0:00:00.067) 0:14:33.670 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:02:42 -0400 (0:00:00.113) 0:14:33.783 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:02:43 -0400 (0:00:00.276) 0:14:34.060 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:02:43 -0400 (0:00:00.222) 0:14:34.290 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:02:43 
-0400 (0:00:00.214) 0:14:34.505 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:02:47 -0400 (0:00:03.564) 0:14:38.069 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:02:47 -0400 (0:00:00.267) 0:14:38.336 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:02:47 -0400 (0:00:00.178) 0:14:38.515 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:02:53 -0400 (0:00:05.547) 0:14:44.062 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:02:53 
-0400 (0:00:00.260) 0:14:44.322 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:02:53 -0400 (0:00:00.136) 0:14:44.459 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:02:53 -0400 (0:00:00.113) 0:14:44.572 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:02:53 -0400 (0:00:00.112) 0:14:44.685 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:02:56 -0400 (0:00:03.174) 0:14:47.860 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": 
"dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", 
"source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": 
"mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": 
"sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service": { "name": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "source": "systemd", "state": "stopped", "status": "inactive" }, 
"systemd-cryptsetup@luks…2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service": { "name": "systemd-cryptsetup@luks…2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { 
"name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": 
"systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": 
"vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:02:59 -0400 (0:00:02.252) 0:14:50.113 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2dfa1bdaaa\x2d25c3\x2d4bcc\x2d8f4d\x2de52e264911ae.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "name": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dev-sda1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice tmp.mount -.mount", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae /dev/sda1 /tmp/storage_testqvrzdgdzlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach 
luks-fa1bdaaa-25c3-4bcc-8f4d-e52e264911ae ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", 
"LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_testqvrzdgdzlukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not 
set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:01:35 EDT", "StateChangeTimestampMonotonic": "2522931840", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2d25c3\x2d4bcc\x2d8f4d\x2de52e264911ae.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "name": "systemd-cryptsetup@luks…2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override 
cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": 
"no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:03:02 -0400 (0:00:03.026) 0:14:53.139 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", 
"/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, 
"type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:03:07 -0400 (0:00:05.125) 0:14:58.265 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:03:07 -0400 (0:00:00.091) 0:14:58.357 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715293.6522753, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "59f8a385bd70192d99cab00c0dbc89763b236293", "ctime": 1776715293.6492755, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715293.6492755, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:03:08 -0400 (0:00:01.270) 0:14:59.627 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:03:08 -0400 (0:00:00.217) 0:14:59.845 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2dfa1bdaaa\x2d25c3\x2d4bcc\x2d8f4d\x2de52e264911ae.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "name": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog 
cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": 
"1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dfa1bdaaa\\x2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": 
"no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2d25c3\x2d4bcc\x2d8f4d\x2de52e264911ae.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "name": "systemd-cryptsetup@luks…2d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not 
set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62d25c3\\x2d4bcc\\x2d8f4d\\x2de52e264911ae.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 
2026 16:03:12 -0400 (0:00:03.484) 0:15:03.330 ********** ok: [managed-node16] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", 
"part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:03:12 -0400 (0:00:00.169) 0:15:03.499 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:03:12 -0400 (0:00:00.126) 0:15:03.626 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:03:12 -0400 (0:00:00.241) 0:15:03.867 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:03:13 -0400 (0:00:00.193) 0:15:04.061 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:03:14 -0400 (0:00:01.654) 0:15:05.716 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed-node16] => (item={'src': 
'/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:03:16 -0400 (0:00:01.213) 0:15:06.929 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:03:16 -0400 
(0:00:00.184) 0:15:07.113 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:03:17 -0400 (0:00:01.467) 0:15:08.581 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715308.5922108, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "508b4188162bbeb4b7002335ff74a84a1c673016", "ctime": 1776715300.0292478, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 358613126, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715300.0282478, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "795629973", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:03:18 -0400 (0:00:01.257) 0:15:09.838 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:03:18 -0400 (0:00:00.075) 0:15:09.914 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role 
success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:03:20 -0400 (0:00:01.905) 0:15:11.819 ********** ok: [managed-node16] => { "changed": false } TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:414 Monday 20 April 2026 16:03:22 -0400 (0:00:01.402) 0:15:13.221 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:421 Monday 20 April 2026 16:03:22 -0400 (0:00:00.287) 0:15:13.509 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:03:22 -0400 (0:00:00.411) 0:15:13.920 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", 
"_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:03:23 -0400 (0:00:00.277) 0:15:14.198 ********** skipping: [managed-node16] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:03:23 -0400 (0:00:00.195) 0:15:14.394 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "06f177c4-ae08-46dd-9417-c35fb55a6429" }, "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "size": "4G", "type": "crypt", "uuid": "7035295a-8be9-4a6a-b390-1e8fd499c4f1" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": 
"250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:03:24 -0400 (0:00:01.407) 0:15:15.802 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002469", "end": "2026-04-20 16:03:25.971757", "rc": 0, "start": "2026-04-20 16:03:25.969288" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:03:26 -0400 (0:00:01.304) 0:15:17.107 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002348", "end": "2026-04-20 16:03:27.280301", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:03:27.277953" } STDOUT: luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:03:27 -0400 (0:00:01.253) 0:15:18.360 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for 
managed-node16 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Monday 20 April 2026 16:03:27 -0400 (0:00:00.393) 0:15:18.754 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Monday 20 April 2026 16:03:27 -0400 (0:00:00.067) 0:15:18.822 **********
ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022324", "end": "2026-04-20 16:03:28.641030", "rc": 0, "start": "2026-04-20 16:03:28.618706" }

STDOUT:

0

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Monday 20 April 2026 16:03:28 -0400 (0:00:00.940) 0:15:19.763 **********
ok: [managed-node16] => { "changed": false }

MSG: All assertions passed

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Monday 20 April 2026 16:03:29 -0400 (0:00:00.227) 0:15:19.990 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 => (item=members)
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 => (item=volumes)

TASK [Set test variables] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Monday 20 April 2026 16:03:29 -0400 (0:00:00.265) 0:15:20.255 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Monday 20 April 2026 16:03:29 -0400 (0:00:00.270) 0:15:20.526 **********
ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Monday 20 April 2026 16:03:30 -0400 (0:00:01.208) 0:15:21.735 **********
ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Monday 20 April 2026 16:03:30 -0400 (0:00:00.146) 0:15:21.881 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Monday 20 April 2026 16:03:31 -0400 (0:00:00.293) 0:15:22.174 **********
ok: [managed-node16] => { "changed": false }

MSG: All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Monday 20 April 2026 16:03:31 -0400 (0:00:00.234) 0:15:22.409 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type - 2] ************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Monday 20 April 2026 16:03:31 -0400 (0:00:00.139) 0:15:22.549 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type - 3] ************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Monday 20 April 2026 16:03:31 -0400 (0:00:00.234) 0:15:22.784 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55
Monday 20 April 2026 16:03:32 -0400 (0:00:00.270) 0:15:23.054 **********
ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }

MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68
Monday 20 April 2026 16:03:32 -0400 (0:00:00.179) 0:15:23.233 **********
ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 }

STDERR: Shared connection to 10.31.41.179 closed.

MSG: non-zero return code

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78
Monday 20 April 2026 16:03:33 -0400 (0:00:01.200) 0:15:24.434 **********
skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "grow_supported.rc == 0", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }
skipping: [managed-node16] => { "changed": false }

MSG: All items skipped

TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88
Monday 20 April 2026 16:03:33 -0400 (0:00:00.166) 0:15:24.600 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Monday 20 April 2026 16:03:34 -0400 (0:00:00.334) 0:15:24.935 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Monday 20 April 2026 16:03:34 -0400 (0:00:00.139) 0:15:25.074 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Monday 20 April 2026 16:03:34 -0400 (0:00:00.106) 0:15:25.181 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Monday 20 April 2026 16:03:34 -0400 (0:00:00.126) 0:15:25.307 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Monday 20 April 2026 16:03:34 -0400 (0:00:00.141) 0:15:25.449 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Monday 20 April 2026 16:03:34 -0400 (0:00:00.087) 0:15:25.537 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Monday 20 April 2026 16:03:34 -0400 (0:00:00.119) 0:15:25.656 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Monday 20 April 2026 16:03:34 -0400 (0:00:00.084) 0:15:25.740 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Monday 20 April 2026 16:03:34 -0400 (0:00:00.150) 0:15:25.891 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Monday 20 April 2026 16:03:35 -0400 (0:00:00.177) 0:15:26.068 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Monday 20 April 2026 16:03:35 -0400 (0:00:00.175) 0:15:26.244 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91
Monday 20 April 2026 16:03:35 -0400 (0:00:00.133) 0:15:26.377 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Monday 20 April 2026 16:03:35 -0400 (0:00:00.234) 0:15:26.612 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Monday 20 April 2026 16:03:35 -0400 (0:00:00.236) 0:15:26.848 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Monday 20 April 2026 16:03:35 -0400 (0:00:00.060) 0:15:26.909 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Monday 20 April 2026 16:03:36 -0400 (0:00:00.070) 0:15:26.979 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Monday 20 April 2026 16:03:36 -0400 (0:00:00.043) 0:15:27.023 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Monday 20 April 2026 16:03:36 -0400 (0:00:00.072) 0:15:27.095 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Monday 20 April 2026 16:03:36 -0400 (0:00:00.046) 0:15:27.142 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Monday 20 April 2026 16:03:36 -0400 (0:00:00.103) 0:15:27.245 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94
Monday 20 April 2026 16:03:36 -0400 (0:00:00.095) 0:15:27.341 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Monday 20 April 2026 16:03:36 -0400 (0:00:00.244) 0:15:27.585 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Monday 20 April 2026 16:03:36 -0400 (0:00:00.068) 0:15:27.654 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Monday 20 April 2026 16:03:36 -0400 (0:00:00.047) 0:15:27.702 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Monday 20 April 2026 16:03:36 -0400 (0:00:00.079) 0:15:27.781 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Monday 20 April 2026 16:03:36 -0400 (0:00:00.146) 0:15:27.928 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97
Monday 20 April 2026 16:03:37 -0400 (0:00:00.104) 0:15:28.032 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16

TASK [Set test variables] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Monday 20 April 2026 16:03:37 -0400 (0:00:00.199) 0:15:28.231 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Monday 20 April 2026 16:03:37 -0400 (0:00:00.108) 0:15:28.340 **********
skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" }
skipping: [managed-node16] => { "changed": false }

MSG: All items skipped

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Monday 20 April 2026 16:03:37 -0400 (0:00:00.218) 0:15:28.558 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 => (item=/dev/sda)

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Monday 20 April 2026 16:03:37 -0400 (0:00:00.252) 0:15:28.811 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Monday 20 April 2026 16:03:37 -0400 (0:00:00.100) 0:15:28.911 **********
ok: [managed-node16] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Monday 20 April 2026 16:03:38 -0400 (0:00:00.172) 0:15:29.083 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Monday 20 April 2026 16:03:38 -0400 (0:00:00.074) 0:15:29.158 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Monday 20 April 2026 16:03:38 -0400 (0:00:00.107) 0:15:29.265 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Monday 20 April 2026 16:03:38 -0400 (0:00:00.144) 0:15:29.409 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Monday 20 April 2026 16:03:38 -0400 (0:00:00.175) 0:15:29.585 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Monday 20 April 2026 16:03:38 -0400 (0:00:00.147) 0:15:29.732 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Monday 20 April 2026 16:03:39 -0400 (0:00:00.212) 0:15:29.944 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Monday 20 April 2026 16:03:39 -0400 (0:00:00.165) 0:15:30.110 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Monday 20 April 2026 16:03:39 -0400 (0:00:00.167) 0:15:30.278 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Monday 20 April 2026 16:03:39 -0400 (0:00:00.081) 0:15:30.359 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Monday 20 April 2026 16:03:39 -0400 (0:00:00.109) 0:15:30.469 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Monday 20 April 2026 16:03:39 -0400 (0:00:00.141) 0:15:30.610 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Monday 20 April 2026 16:03:39 -0400 (0:00:00.204) 0:15:30.814 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Monday 20 April 2026 16:03:39 -0400 (0:00:00.070) 0:15:30.885 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Monday 20 April 2026 16:03:40 -0400 (0:00:00.231) 0:15:31.117 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Monday 20 April 2026 16:03:40 -0400 (0:00:00.271) 0:15:31.388 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Monday 20 April 2026 16:03:40 -0400 (0:00:00.120) 0:15:31.508 **********
skipping: [managed-node16] => { "false_condition": "storage_test_pool.type == 'stratis'" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Monday 20 April 2026 16:03:40 -0400 (0:00:00.085) 0:15:31.593 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Monday 20 April 2026 16:03:40 -0400 (0:00:00.294) 0:15:31.888 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Monday 20 April 2026 16:03:41 -0400 (0:00:00.074) 0:15:31.962 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Monday 20 April 2026 16:03:41 -0400 (0:00:00.143) 0:15:32.106 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Monday 20 April 2026 16:03:41 -0400 (0:00:00.096) 0:15:32.202 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Monday 20 April 2026 16:03:41 -0400 (0:00:00.149) 0:15:32.352 **********
ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Monday 20 April 2026 16:03:41 -0400 (0:00:00.246) 0:15:32.598 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks':
['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:03:41 -0400 (0:00:00.258) 0:15:32.857 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:03:42 -0400 (0:00:00.278) 0:15:33.136 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) 
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md)
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size)
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 20 April 2026 16:03:43 -0400 (0:00:01.153) 0:15:34.289 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 20 April 2026 16:03:43 -0400 (0:00:00.325) 0:15:34.615 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 20 April 2026 16:03:44 -0400 (0:00:00.433) 0:15:35.048 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 20 April 2026 16:03:44 -0400 (0:00:00.124) 0:15:35.173 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 20 April 2026 16:03:44 -0400 (0:00:00.292) 0:15:35.465 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 20 April 2026 16:03:44 -0400 (0:00:00.389) 0:15:35.854 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Monday 20 April 2026 16:03:45 -0400 (0:00:00.387) 0:15:36.242 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Monday 20 April 2026 16:03:45 -0400 (0:00:00.305) 0:15:36.547 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 20 April 2026 16:03:45 -0400 (0:00:00.195) 0:15:36.743 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 20 April 2026 16:03:46 -0400 (0:00:00.186) 0:15:36.929 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 20 April 2026 16:03:46 -0400 (0:00:00.185) 0:15:37.115 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 20 April 2026 16:03:46 -0400 (0:00:00.135) 0:15:37.250 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 20 April 2026 16:03:46 -0400 (0:00:00.588) 0:15:37.839 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 20 April 2026 16:03:47 -0400 (0:00:00.199) 0:15:38.038 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 20 April 2026 16:03:47 -0400 (0:00:00.337) 0:15:38.376 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026 16:03:47 -0400 (0:00:00.131) 0:15:38.508 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path:
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026 16:03:47 -0400 (0:00:00.150) 0:15:38.659 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026 16:03:47 -0400 (0:00:00.115) 0:15:38.774 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026 16:03:48 -0400 (0:00:00.220) 0:15:38.995 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026 16:03:48 -0400 (0:00:00.128) 0:15:39.124 **********
ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715341.3830693, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715283.857318, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 276233, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715283.857318, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026 16:03:49 -0400 (0:00:01.069) 0:15:40.194 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026 16:03:49 -0400 (0:00:00.082) 0:15:40.277 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026 16:03:49 -0400 (0:00:00.130) 0:15:40.407 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026 16:03:49 -0400 (0:00:00.142) 0:15:40.549 **********
ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026 16:03:49 -0400 (0:00:00.113) 0:15:40.663 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026 16:03:49 -0400 (0:00:00.146) 0:15:40.810 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026 16:03:50 -0400 (0:00:00.239) 0:15:41.049 **********
ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715387.1438718, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715283.9973173, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 276389, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715283.9973173, "nlink": 1, "path": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026 16:03:51 -0400 (0:00:01.351) 0:15:42.402 **********
ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026 16:03:55 -0400 (0:00:03.662) 0:15:46.064 **********
ok: [managed-node16] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009899", "end": "2026-04-20 16:03:55.779082", "rc": 0, "start": "2026-04-20 16:03:55.769183" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           06f177c4-ae08-46dd-9417-c35fb55a6429
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     930270
        Threads:    2
        Salt:       d0 20 4c 39 55 df d9 d6 8c 54 1a 92 51 ed d2 39
                    96 f6 8e 0e 9f c6 04 2f cf 3b 9f 5f ed 08 8f a4
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120249
        Salt:       6f 9b 10 39 0c db 01 79 f6 72 05 25 f0 a5 6f 60
                    cb d4 60 c9 4f f5 85 46 14 29 a5 64 60 3a 5d 8b
        Digest:     2c 4e 25 83 a9 1d 85 f9 81 0b 79 0b 09 ec d4 88
                    61 92 d4 51 86 7a 41 dd 4a c6 da 26 6d 23 fc 63

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 20 April 2026 16:03:55 -0400 (0:00:00.768)
0:15:46.832 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 20 April 2026 16:03:56 -0400 (0:00:00.135) 0:15:46.968 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 20 April 2026 16:03:56 -0400 (0:00:00.169) 0:15:47.137 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026 16:03:56 -0400 (0:00:00.147) 0:15:47.285 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026 16:03:56 -0400 (0:00:00.153) 0:15:47.438 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026 16:03:56 -0400 (0:00:00.176) 0:15:47.614 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption_key_size > 0", "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026 16:03:56 -0400 (0:00:00.146) 0:15:47.761 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026 16:03:57 -0400 (0:00:00.172) 0:15:47.933 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026 16:03:57 -0400 (0:00:00.295) 0:15:48.229 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026 16:03:57 -0400 (0:00:00.228) 0:15:48.458 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026 16:03:57 -0400 (0:00:00.185) 0:15:48.644 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026 16:03:57 -0400 (0:00:00.209) 0:15:48.853 **********
ok: [managed-node16] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026 16:03:58 -0400 (0:00:00.151) 0:15:49.005 **********
ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026 16:03:58 -0400 (0:00:00.099) 0:15:49.105 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026 16:03:58 -0400 (0:00:00.105) 0:15:49.210 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026 16:03:58 -0400 (0:00:00.096) 0:15:49.307 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026 16:03:58 -0400 (0:00:00.132) 0:15:49.439 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 20 April 2026 16:03:58 -0400 (0:00:00.067) 0:15:49.507 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 20 April 2026 16:03:58 -0400 (0:00:00.073) 0:15:49.580 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 20 April 2026 16:03:58 -0400 (0:00:00.087) 0:15:49.668 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task
path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026 16:03:58 -0400 (0:00:00.094) 0:15:49.763 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026 16:03:58 -0400 (0:00:00.087) 0:15:49.851 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026 16:03:59 -0400 (0:00:00.077) 0:15:49.928 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 16:03:59 -0400 (0:00:00.147) 0:15:50.076 **********
ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 16:04:00 -0400 (0:00:00.864) 0:15:50.940 **********
ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 16:04:01 -0400 (0:00:01.009) 0:15:51.950 **********
ok: [managed-node16] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 16:04:01 -0400 (0:00:00.269) 0:15:52.219 **********
ok: [managed-node16] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 16:04:01 -0400 (0:00:00.223) 0:15:52.443 **********
ok: [managed-node16] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 16:04:02 -0400 (0:00:01.291) 0:15:53.735 **********
skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 16:04:02 -0400 (0:00:00.176) 0:15:53.911 **********
skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 16:04:03 -0400 (0:00:00.177) 0:15:54.088 **********
skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 16:04:03 -0400 (0:00:00.205) 0:15:54.294 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 16:04:03 -0400 (0:00:00.188) 0:15:54.483 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 16:04:03 -0400 (0:00:00.254) 0:15:54.738 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 16:04:04 -0400 (0:00:00.294) 0:15:55.032 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 16:04:04 -0400 (0:00:00.188) 0:15:55.220 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 16:04:04 -0400 (0:00:00.273) 0:15:55.493 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 16:04:04 -0400 (0:00:00.275) 0:15:55.769 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 16:04:05 -0400 (0:00:00.306) 0:15:56.076 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April 2026 16:04:05 -0400 (0:00:00.312) 0:15:56.389 **********
skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 20 April 2026 16:04:05 -0400 (0:00:00.235) 0:15:56.624 **********
skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 16:04:05 -0400 (0:00:00.230) 0:15:56.855 **********
skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 16:04:06 -0400 (0:00:00.240) 0:15:57.095 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 16:04:06 -0400 (0:00:00.225) 0:15:57.321 **********
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 16:04:06 -0400 (0:00:00.166) 0:15:57.488 **********
skipping: [managed-node16] => {
"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:04:06 -0400 (0:00:00.262) 0:15:57.750 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:04:07 -0400 (0:00:00.190) 0:15:57.941 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:04:07 -0400 (0:00:00.316) 0:15:58.257 ********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:04:07 -0400 (0:00:00.168) 0:15:58.426 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 
16:04:07 -0400 (0:00:00.166) 0:15:58.593 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:04:07 -0400 (0:00:00.155) 0:15:58.748 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025475", "end": "2026-04-20 16:04:08.967793", "rc": 0, "start": "2026-04-20 16:04:08.942318" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:04:09 -0400 (0:00:01.357) 0:16:00.106 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:04:09 -0400 (0:00:00.390) 0:16:00.496 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:04:09 -0400 (0:00:00.222) 0:16:00.718 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" 
} TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:04:09 -0400 (0:00:00.174) 0:16:00.893 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:04:10 -0400 (0:00:00.191) 0:16:01.084 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:04:10 -0400 (0:00:00.106) 0:16:01.191 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:04:10 -0400 (0:00:00.323) 0:16:01.515 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:04:10 -0400 (0:00:00.126) 0:16:01.641 ********** skipping: [managed-node16] => { "changed": false, 
"skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:04:10 -0400 (0:00:00.221) 0:16:01.862 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 16:04:11 -0400 (0:00:00.156) 0:16:02.019 ********** changed: [managed-node16] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:427 Monday 20 April 2026 16:04:12 -0400 (0:00:01.256) 0:16:03.275 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 16:04:12 -0400 (0:00:00.372) 0:16:03.648 ********** ok: [managed-node16] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 
Monday 20 April 2026 16:04:12 -0400 (0:00:00.182) 0:16:03.831 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 16:04:13 -0400 (0:00:00.137) 0:16:03.968 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:04:13 -0400 (0:00:00.082) 0:16:04.050 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:04:13 -0400 (0:00:00.126) 0:16:04.177 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:04:13 -0400 (0:00:00.282) 0:16:04.460 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:04:13 -0400 (0:00:00.135) 0:16:04.596 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] 
******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:04:15 -0400 (0:00:01.855) 0:16:06.452 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:04:16 -0400 (0:00:01.289) 0:16:07.742 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:04:17 -0400 (0:00:00.459) 0:16:08.202 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:04:17 -0400 (0:00:00.211) 0:16:08.413 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:04:17 -0400 (0:00:00.187) 0:16:08.600 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:04:17 -0400 (0:00:00.134) 0:16:08.735 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 
16:04:17 -0400 (0:00:00.150) 0:16:08.886 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:04:18 -0400 (0:00:00.439) 0:16:09.325 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:04:18 -0400 (0:00:00.207) 0:16:09.533 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:04:18 -0400 (0:00:00.190) 0:16:09.723 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:04:22 -0400 (0:00:03.432) 0:16:13.156 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", 
"encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:04:22 -0400 (0:00:00.187) 0:16:13.343 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:04:22 -0400 (0:00:00.178) 0:16:13.522 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:04:27 -0400 (0:00:05.096) 0:16:18.619 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:04:28 -0400 (0:00:00.388) 0:16:19.007 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:04:28 -0400 (0:00:00.130) 0:16:19.138 ********** skipping: [managed-node16] => { "changed": false, 
"false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:04:28 -0400 (0:00:00.138) 0:16:19.276 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:04:28 -0400 (0:00:00.145) 0:16:19.422 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:04:31 -0400 (0:00:03.427) 0:16:22.850 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": 
"hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { 
"name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service": { "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service": { "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:04:34 -0400 (0:00:02.299) 0:16:25.150 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d06f177c4\x2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast 
cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-06f177c4-ae08-46dd-9417-c35fb55a6429 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": 
"systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:02 EDT", "StateChangeTimestampMonotonic": "2609073227", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", 
"TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap 
cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", 
"SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:04:37 -0400 (0:00:02.926) 0:16:28.076 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-06f177c4-ae08-46dd-9417-c35fb55a6429' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 16:04:42 -0400 (0:00:05.596) 0:16:33.673 ********** fatal: [managed-node16]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-06f177c4-ae08-46dd-9417-c35fb55a6429' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': 
False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:04:42 -0400 (0:00:00.122) 0:16:33.795 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d06f177c4\x2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", 
"AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", 
"IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", 
"MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:02 EDT", "StateChangeTimestampMonotonic": "2609073227", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease 
cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 16:04:44 -0400 (0:00:02.059) 0:16:35.855 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 16:04:44 -0400 (0:00:00.025) 0:16:35.881 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 16:04:44 -0400 (0:00:00.044) 0:16:35.926 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 16:04:45 -0400 (0:00:00.046) 0:16:35.973 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715452.1865911, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715452.1865911, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776715452.1865911, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2786761317", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 16:04:45 -0400 (0:00:00.532) 0:16:36.505 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:451 Monday 20 April 2026 16:04:45 -0400 (0:00:00.068) 0:16:36.573 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 16:04:45 -0400 (0:00:00.154) 0:16:36.728 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:04:45 -0400 (0:00:00.009) 0:16:36.738 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:04:45 -0400 (0:00:00.060) 0:16:36.798 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:04:45 -0400 (0:00:00.061) 0:16:36.859 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:04:46 -0400 (0:00:00.100) 0:16:36.960 
********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:04:47 -0400 (0:00:01.196) 0:16:38.156 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:04:47 -0400 (0:00:00.668) 0:16:38.825 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, 
"ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:04:48 -0400 (0:00:00.168) 0:16:38.994 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:04:48 -0400 (0:00:00.054) 0:16:39.049 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:04:48 -0400 (0:00:00.059) 0:16:39.108 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:04:48 -0400 (0:00:00.038) 0:16:39.147 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:04:48 -0400 (0:00:00.043) 0:16:39.190 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:04:48 -0400 (0:00:00.093) 0:16:39.284 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:04:48 -0400 (0:00:00.036) 0:16:39.320 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:04:48 -0400 (0:00:00.050) 0:16:39.371 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:04:51 -0400 (0:00:02.961) 0:16:42.333 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ 
"sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:04:51 -0400 (0:00:00.117) 0:16:42.450 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:04:51 -0400 (0:00:00.177) 0:16:42.628 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:04:56 -0400 (0:00:04.926) 0:16:47.554 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:04:56 -0400 (0:00:00.094) 0:16:47.649 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 
16:04:56 -0400 (0:00:00.075) 0:16:47.724 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:04:56 -0400 (0:00:00.080) 0:16:47.805 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:04:56 -0400 (0:00:00.048) 0:16:47.853 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:04:59 -0400 (0:00:02.880) 0:16:50.734 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": 
"dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": 
{ "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service": { "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service": { "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:05:02 -0400 (0:00:02.470) 0:16:53.204 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d06f177c4\x2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid 
cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-06f177c4-ae08-46dd-9417-c35fb55a6429 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", 
"IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", 
"NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:02 EDT", "StateChangeTimestampMonotonic": "2609073227", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write 
cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", 
"RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:05:06 -0400 (0:00:04.340) 0:16:57.545 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "password": "-", "state": "absent" 
} ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:05:12 -0400 (0:00:06.065) 0:17:03.611 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:05:12 -0400 (0:00:00.208) 0:17:03.820 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715293.6522753, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "59f8a385bd70192d99cab00c0dbc89763b236293", "ctime": 1776715293.6492755, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715293.6492755, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 
20 April 2026 16:05:14 -0400 (0:00:01.151) 0:17:04.972 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:05:15 -0400 (0:00:01.522) 0:17:06.495 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d06f177c4\x2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control 
cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", 
"Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:02 EDT", "StateChangeTimestampMonotonic": "2609073227", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": 
"18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": 
"n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", 
"WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:05:18 -0400 (0:00:03.114) 0:17:09.609 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:05:18 -0400 (0:00:00.179) 0:17:09.789 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": 
"lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:05:19 -0400 (0:00:00.223) 0:17:10.012 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:05:19 -0400 (0:00:00.064) 0:17:10.076 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: 
[managed-node16] => (item={'src': '/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-06f177c4-ae08-46dd-9417-c35fb55a6429" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:05:20 -0400 (0:00:01.199) 0:17:11.276 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:05:21 -0400 (0:00:01.297) 0:17:12.574 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK 
[fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:05:22 -0400 (0:00:01.293) 0:17:13.868 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:05:23 -0400 (0:00:00.158) 0:17:14.027 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:05:24 -0400 (0:00:01.439) 0:17:15.466 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715308.5922108, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "508b4188162bbeb4b7002335ff74a84a1c673016", "ctime": 1776715300.0292478, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 358613126, "isblk": 
false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715300.0282478, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "795629973", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:05:25 -0400 (0:00:01.044) 0:17:16.511 ********** changed: [managed-node16] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-06f177c4-ae08-46dd-9417-c35fb55a6429', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:05:26 -0400 (0:00:01.187) 0:17:17.698 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:05:28 -0400 (0:00:01.973) 0:17:19.672 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:467 Monday 20 
April 2026 16:05:30 -0400 (0:00:01.480) 0:17:21.153 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:05:30 -0400 (0:00:00.314) 0:17:21.467 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:05:30 -0400 (0:00:00.282) 0:17:21.750 ********** skipping: [managed-node16] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:05:30 -0400 (0:00:00.125) 0:17:21.875 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "86806b93-e836-4dae-8489-248a606b36d4" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, 
"/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:05:31 -0400 (0:00:01.000) 0:17:22.876 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002353", "end": "2026-04-20 16:05:32.730717", "rc": 0, "start": "2026-04-20 16:05:32.728364" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:05:32 -0400 (0:00:00.893) 0:17:23.770 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002392", "end": "2026-04-20 16:05:33.915664", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:05:33.913272" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:05:34 -0400 (0:00:01.197) 0:17:24.968 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node16 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': 
None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 16:05:34 -0400 (0:00:00.252) 0:17:25.220 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] 
********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 16:05:34 -0400 (0:00:00.110) 0:17:25.330 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024716", "end": "2026-04-20 16:05:35.307367", "rc": 0, "start": "2026-04-20 16:05:35.282651" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 16:05:35 -0400 (0:00:01.011) 0:17:26.341 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 16:05:35 -0400 (0:00:00.264) 0:17:26.606 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 => (item=members) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 16:05:35 -0400 (0:00:00.270) 0:17:26.876 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 16:05:36 -0400 (0:00:00.268) 0:17:27.145 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 16:05:37 -0400 (0:00:01.178) 0:17:28.323 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 16:05:37 -0400 (0:00:00.110) 0:17:28.433 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 16:05:37 -0400 (0:00:00.103) 0:17:28.537 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 16:05:37 -0400 (0:00:00.185) 0:17:28.722 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 16:05:37 -0400 (0:00:00.102) 0:17:28.824 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 16:05:38 -0400 (0:00:00.139) 0:17:28.964 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 16:05:38 -0400 (0:00:00.167) 0:17:29.131 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 16:05:38 -0400 (0:00:00.128) 0:17:29.259 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.41.179 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 16:05:39 -0400 (0:00:01.052) 0:17:30.312 ********** skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "grow_supported.rc == 0", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 16:05:39 -0400 (0:00:00.120) 0:17:30.433 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 16:05:39 -0400 (0:00:00.218) 0:17:30.651 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 16:05:39 -0400 (0:00:00.056) 0:17:30.708 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 16:05:39 -0400 (0:00:00.152) 0:17:30.860 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 16:05:40 -0400 (0:00:00.144) 0:17:31.005 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 16:05:40 -0400 (0:00:00.143) 0:17:31.149 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 16:05:40 -0400 (0:00:00.044) 0:17:31.193 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 16:05:40 -0400 (0:00:00.034) 0:17:31.228 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": 
"Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 16:05:40 -0400 (0:00:00.052) 0:17:31.280 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 16:05:40 -0400 (0:00:00.072) 0:17:31.352 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 16:05:40 -0400 (0:00:00.030) 0:17:31.383 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 16:05:40 -0400 (0:00:00.029) 0:17:31.413 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 16:05:40 -0400 (0:00:00.049) 0:17:31.462 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 16:05:40 -0400 (0:00:00.201) 0:17:31.664 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 16:05:40 -0400 (0:00:00.130) 0:17:31.794 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 16:05:40 -0400 (0:00:00.100) 0:17:31.895 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 16:05:41 -0400 (0:00:00.131) 0:17:32.027 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 16:05:41 -0400 (0:00:00.115) 0:17:32.143 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 16:05:41 -0400 (0:00:00.123) 0:17:32.266 ********** 
skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 16:05:41 -0400 (0:00:00.134) 0:17:32.400 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 16:05:41 -0400 (0:00:00.168) 0:17:32.569 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 16:05:41 -0400 (0:00:00.088) 0:17:32.658 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 16:05:41 -0400 (0:00:00.126) 0:17:32.784 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 
'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 16:05:41 -0400 (0:00:00.081) 0:17:32.866 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 16:05:41 -0400 (0:00:00.038) 0:17:32.905 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] 
****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 16:05:42 -0400 (0:00:00.056) 0:17:32.962 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 16:05:42 -0400 (0:00:00.163) 0:17:33.126 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 16:05:42 -0400 (0:00:00.143) 0:17:33.269 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 16:05:42 -0400 (0:00:00.201) 0:17:33.471 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 16:05:42 -0400 (0:00:00.163) 0:17:33.634 ********** skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": 
"_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 16:05:42 -0400 (0:00:00.115) 0:17:33.749 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 16:05:43 -0400 (0:00:00.183) 0:17:33.932 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 16:05:43 -0400 (0:00:00.205) 0:17:34.138 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 16:05:43 -0400 (0:00:00.156) 0:17:34.294 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 16:05:43 -0400 (0:00:00.119) 0:17:34.414 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 16:05:43 -0400 (0:00:00.112) 0:17:34.527 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 16:05:43 -0400 (0:00:00.196) 0:17:34.723 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 16:05:43 -0400 (0:00:00.177) 0:17:34.901 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 16:05:44 -0400 (0:00:00.174) 0:17:35.076 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 16:05:44 -0400 (0:00:00.336) 0:17:35.412 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 16:05:44 -0400 (0:00:00.372) 0:17:35.785 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 16:05:44 -0400 (0:00:00.140) 0:17:35.925 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 16:05:45 -0400 (0:00:00.164) 0:17:36.090 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 16:05:45 -0400 (0:00:00.192) 0:17:36.283 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 16:05:45 -0400 (0:00:00.105) 0:17:36.388 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 16:05:45 -0400 (0:00:00.193) 0:17:36.581 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 16:05:45 -0400 (0:00:00.117) 0:17:36.699 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 16:05:45 -0400 (0:00:00.071) 0:17:36.770 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 16:05:46 -0400 (0:00:00.282) 0:17:37.053 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 16:05:46 -0400 (0:00:00.274) 0:17:37.327 ********** skipping: [managed-node16] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 16:05:46 -0400 (0:00:00.176) 0:17:37.503 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 16:05:46 -0400 (0:00:00.177) 0:17:37.680 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 16:05:46 -0400 (0:00:00.194) 0:17:37.875 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 16:05:47 -0400 (0:00:00.189) 0:17:38.064 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset 
variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 16:05:47 -0400 (0:00:00.102) 0:17:38.167 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 16:05:47 -0400 (0:00:00.210) 0:17:38.377 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 16:05:47 -0400 (0:00:00.092) 0:17:38.469 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': 
False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:05:47 -0400 (0:00:00.098) 0:17:38.568 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:05:47 -0400 (0:00:00.051) 0:17:38.620 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:05:48 -0400 (0:00:00.363) 0:17:38.983 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:05:48 -0400 (0:00:00.128) 0:17:39.112 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:05:48 -0400 (0:00:00.124) 0:17:39.237 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current 
mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:05:48 -0400 (0:00:00.064) 0:17:39.301 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:05:48 -0400 (0:00:00.079) 0:17:39.381 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:05:48 -0400 (0:00:00.115) 0:17:39.497 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:05:48 -0400 (0:00:00.196) 0:17:39.694 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:05:49 -0400 (0:00:00.327) 0:17:40.021 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:05:49 -0400 (0:00:00.116) 0:17:40.138 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:05:49 -0400 (0:00:00.064) 0:17:40.202 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:05:49 -0400 (0:00:00.091) 0:17:40.294 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:05:49 -0400 (0:00:00.117) 0:17:40.412 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", 
"storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:05:49 -0400 (0:00:00.253) 0:17:40.666 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:05:49 -0400 (0:00:00.120) 0:17:40.787 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:05:49 -0400 (0:00:00.089) 0:17:40.877 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:05:50 -0400 (0:00:00.058) 0:17:40.935 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:05:50 -0400 (0:00:00.096) 0:17:41.032 ********** ok: 
[managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:05:50 -0400 (0:00:00.083) 0:17:41.115 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:05:50 -0400 (0:00:00.124) 0:17:41.240 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:05:50 -0400 (0:00:00.188) 0:17:41.429 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715512.4953315, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715512.4953315, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 310736, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715512.4953315, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, 
"size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:05:51 -0400 (0:00:01.033) 0:17:42.463 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:05:51 -0400 (0:00:00.054) 0:17:42.517 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:05:51 -0400 (0:00:00.166) 0:17:42.683 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:05:52 -0400 (0:00:00.292) 0:17:42.976 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:05:52 -0400 (0:00:00.191) 0:17:43.167 ********** skipping: [managed-node16] => 
{ "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:05:52 -0400 (0:00:00.216) 0:17:43.384 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:05:52 -0400 (0:00:00.237) 0:17:43.621 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:05:52 -0400 (0:00:00.298) 0:17:43.920 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:05:56 -0400 (0:00:03.633) 0:17:47.553 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:05:56 -0400 
(0:00:00.213) 0:17:47.767 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:05:57 -0400 (0:00:00.184) 0:17:47.951 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:05:57 -0400 (0:00:00.250) 0:17:48.202 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:05:57 -0400 (0:00:00.297) 0:17:48.500 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:05:57 -0400 (0:00:00.146) 0:17:48.646 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task 
path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:05:57 -0400 (0:00:00.186) 0:17:48.833 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:05:58 -0400 (0:00:00.150) 0:17:48.983 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:05:58 -0400 (0:00:00.175) 0:17:49.158 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:05:58 -0400 (0:00:00.265) 0:17:49.423 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:05:58 -0400 (0:00:00.295) 0:17:49.719 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:05:58 -0400 (0:00:00.131) 0:17:49.850 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:05:59 -0400 (0:00:00.255) 0:17:50.105 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:05:59 -0400 (0:00:00.175) 0:17:50.281 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:05:59 -0400 (0:00:00.158) 0:17:50.440 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] 
************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:05:59 -0400 (0:00:00.158) 0:17:50.598 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:05:59 -0400 (0:00:00.200) 0:17:50.799 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:06:00 -0400 (0:00:00.156) 0:17:50.955 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:06:00 -0400 (0:00:00.167) 0:17:51.123 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:06:00 -0400 (0:00:00.075) 0:17:51.199 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:06:00 -0400 (0:00:00.253) 0:17:51.452 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:06:00 -0400 (0:00:00.167) 0:17:51.619 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:06:00 -0400 (0:00:00.265) 0:17:51.884 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:06:01 -0400 (0:00:00.240) 0:17:52.124 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 16:06:01 -0400 (0:00:00.227) 0:17:52.352 **********
ok: [managed-node16] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 16:06:02 -0400 (0:00:01.409) 0:17:53.761 **********
ok: [managed-node16] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 16:06:04 -0400 (0:00:01.230) 0:17:54.991 **********
ok: [managed-node16] => {"ansible_facts": {"storage_test_expected_size": "4294967296"}, "changed": false}

TASK [Show expected size] ******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 16:06:04 -0400 (0:00:00.268) 0:17:55.259 **********
ok: [managed-node16] => {"storage_test_expected_size": "4294967296"}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 16:06:04 -0400 (0:00:00.211) 0:17:55.471 **********
ok: [managed-node16] => {"bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB"}

TASK [Show test pool] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 16:06:05 -0400 (0:00:01.289) 0:17:56.761 **********
skipping: [managed-node16] => {"false_condition": "'%' in storage_test_volume.size | string"}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 16:06:05 -0400 (0:00:00.156) 0:17:56.917 **********
skipping: [managed-node16] => {"false_condition": "'%' in storage_test_volume.size | string"}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 16:06:06 -0400 (0:00:00.211) 0:17:57.129 **********
skipping: [managed-node16] => {"false_condition": "'%' in storage_test_volume.size | string"}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 16:06:06 -0400 (0:00:00.130) 0:17:57.260 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False"}

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 16:06:06 -0400 (0:00:00.159) 0:17:57.419 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False"}

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 16:06:06 -0400 (0:00:00.235) 0:17:57.654 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False"}

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 16:06:06 -0400 (0:00:00.254) 0:17:57.909 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False"}

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 16:06:07 -0400 (0:00:00.244) 0:17:58.153 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False"}

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 16:06:07 -0400 (0:00:00.183) 0:17:58.337 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False"}

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 16:06:07 -0400 (0:00:00.193) 0:17:58.530 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.thin |
bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:06:07 -0400 (0:00:00.240) 0:17:58.771 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:06:08 -0400 (0:00:00.275) 0:17:59.046 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:06:08 -0400 (0:00:00.284) 0:17:59.331 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:06:08 -0400 (0:00:00.198) 0:17:59.529 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:06:08 -0400 (0:00:00.109) 0:17:59.639 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was 
False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:06:08 -0400 (0:00:00.265) 0:17:59.904 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:06:09 -0400 (0:00:00.185) 0:18:00.090 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:06:09 -0400 (0:00:00.257) 0:18:00.348 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:06:09 -0400 (0:00:00.255) 0:18:00.603 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:06:09 -0400 (0:00:00.139) 0:18:00.743 
********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:06:09 -0400 (0:00:00.130) 0:18:00.873 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:06:10 -0400 (0:00:00.211) 0:18:01.085 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:06:10 -0400 (0:00:00.274) 0:18:01.360 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024746", "end": "2026-04-20 16:06:11.762054", "rc": 0, "start": "2026-04-20 16:06:11.737308" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:06:11 -0400 (0:00:01.473) 0:18:02.833 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] 
******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 16:06:12 -0400 (0:00:00.137) 0:18:02.971 **********
ok: [managed-node16] => {"changed": false}
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 16:06:12 -0400 (0:00:00.131) 0:18:03.102 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False"}

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 16:06:12 -0400 (0:00:00.098) 0:18:03.201 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False"}

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 16:06:12 -0400 (0:00:00.075) 0:18:03.276 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False"}

TASK [Check cache size] ********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 16:06:12 -0400 (0:00:00.048) 0:18:03.325 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False"}

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 16:06:12 -0400 (0:00:00.195) 0:18:03.521 **********
ok: [managed-node16] => {"ansible_facts": {"_storage_test_volume_present": null}, "changed": false}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026 16:06:12 -0400 (0:00:00.175) 0:18:03.697 **********
skipping: [managed-node16] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 16:06:13 -0400 (0:00:00.726) 0:18:04.424 **********
ok: [managed-node16] => {"ansible_facts": {"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null}, "changed": false}

TASK [Create a file] ***********************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Monday 20 April 2026 16:06:13 -0400 (0:00:00.116) 0:18:04.540 **********
changed: [managed-node16] => {"changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0}

TASK [Test for correct handling of safe_mode - 6] ******************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:473
Monday 20 April 2026 16:06:14 -0400 (0:00:01.233) 0:18:05.774 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node16

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 20 April 2026 16:06:15 -0400 (0:00:00.358) 0:18:06.132 **********
ok: [managed-node16] => {"ansible_facts": {"storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": []}, "changed": false}

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 20 April 2026 16:06:15 -0400 (0:00:00.273) 0:18:06.406 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16

TASK [Clear facts] *************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Monday 20 April 2026 16:06:15 -0400 (0:00:00.207) 0:18:06.613 **********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 16:06:15 -0400 (0:00:00.064) 0:18:06.678 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False"}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 16:06:15 -0400 (0:00:00.127) 0:18:06.805 **********

TASK
[fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 16:06:16 -0400 (0:00:00.231) 0:18:07.037 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 16:06:16 -0400 (0:00:00.186) 0:18:07.223 **********
ok: [managed-node16]

TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] *******
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 16:06:18 -0400 (0:00:02.259) 0:18:09.483 **********
ok: [managed-node16] => {"changed": false}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13
Monday 20 April 2026 16:06:19 -0400 (0:00:01.084) 0:18:10.568 **********
skipping: [managed-node16] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node16] => (item=CentOS.yml) => {"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False"}
ok: [managed-node16] => (item=CentOS_8.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"]}, "ansible_included_var_files": ["/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml"}
ok: [managed-node16] => (item=CentOS_8.yml) => {"ansible_facts": {"blivet_package_list": ["python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"]}, "ansible_included_var_files": ["/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml"}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31
Monday 20 April 2026 16:06:19 -0400 (0:00:00.119) 0:18:10.687 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36
Monday 20 April 2026 16:06:19 -0400 (0:00:00.149) 0:18:10.837 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Monday 20 April 2026 16:06:20 -0400 (0:00:00.125) 0:18:10.963 **********
ok: [managed-node16] => {"ansible_facts": {"_storage_pools_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 16:06:20 -0400 (0:00:00.162) 0:18:11.125 **********
ok: [managed-node16] => {"ansible_facts": {"_storage_volumes_list": []}, "changed": false}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 16:06:20 -0400 (0:00:00.180) 0:18:11.305 **********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 16:06:20 -0400 (0:00:00.388) 0:18:11.694 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 16:06:20 -0400 (0:00:00.130) 0:18:11.824 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path:
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 16:06:21 -0400 (0:00:00.155) 0:18:11.980 **********
ok: [managed-node16] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 16:06:24 -0400 (0:00:03.297) 0:18:15.278 **********
ok: [managed-node16] => {"storage_pools | d([])": [{"disks": ["sda"], "name": "foo", "type": "lvm", "volumes": [{"encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g"}]}]}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 16:06:24 -0400 (0:00:00.169) 0:18:15.447 **********
ok: [managed-node16] => {"storage_volumes | d([])": []}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 16:06:24 -0400 (0:00:00.150) 0:18:15.598 **********
ok: [managed-node16] => {"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": ["cryptsetup", "lvm2"], "pools": [], "volumes": []}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 16:06:29 -0400 (0:00:04.954) 0:18:20.553 **********
included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 16:06:29 -0400 (0:00:00.158) 0:18:20.711 **********
skipping: [managed-node16] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 16:06:29 -0400 (0:00:00.034) 0:18:20.745 **********
skipping: [managed-node16] => {"changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 16:06:29 -0400 (0:00:00.108) 0:18:20.854 **********
skipping: [managed-node16] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 16:06:30 -0400 (0:00:00.130) 0:18:20.985 **********
ok: [managed-node16] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 16:06:33 -0400 (0:00:02.990) 0:18:23.975 **********
ok: [managed-node16] => {"ansible_facts": {"services": {
    "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
    "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "NetworkManager.service": {"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled"},
    "auditd.service": {"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static"},
    "auto-cpufreq.service": {"name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found"},
    "autovt@.service": {"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
    "blivet.service": {"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static"},
    "blk-availability.service": {"name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled"},
    "chrony-dnssrv@.service": {"name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static"},
    "chrony-wait.service": {"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled"},
    "chronyd.service": {"name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled"},
    "cloud-config.service": {"name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-final.service": {"name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static"},
    "cloud-init-local.service": {"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled"},
    "cloud-init.service": {"name": "cloud-init.service", "source": "systemd", "state": "stopped",
"status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { 
"name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": 
"sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service": { "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service": { "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": 
"running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": 
"systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:06:35 -0400 (0:00:02.084) 0:18:26.060 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d06f177c4\x2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-06f177c4-ae08-46dd-9417-c35fb55a6429", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; 
argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-06f177c4-ae08-46dd-9417-c35fb55a6429 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-06f177c4-ae08-46dd-9417-c35fb55a6429 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", 
"LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": 
"null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:02 EDT", "StateChangeTimestampMonotonic": "2609073227", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", 
"CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 
2026 16:06:37 -0400 (0:00:02.218) 0:18:28.278 ********** fatal: [managed-node16]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 16:06:42 -0400 (0:00:04.790) 0:18:33.068 ********** fatal: [managed-node16]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 
'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:06:42 -0400 (0:00:00.253) 0:18:33.322 ********** changed: [managed-node16] => (item=systemd-cryptsetup@luks\x2d06f177c4\x2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": 
"0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d06f177c4\\x2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", 
"StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node16] => (item=systemd-cryptsetup@luks…2dae08\x2d46dd\x2d9417\x2dc35fb55a6429.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "name": "systemd-cryptsetup@luks…2dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", 
"CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", 
"KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\xe2\\x80\\xa62dae08\\x2d46dd\\x2d9417\\x2dc35fb55a6429.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", 
"PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 16:06:44 -0400 (0:00:02.440) 0:18:35.763 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error 
message are correct] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 16:06:44 -0400 (0:00:00.149) 0:18:35.912 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 16:06:45 -0400 (0:00:00.193) 0:18:36.105 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 16:06:45 -0400 (0:00:00.185) 0:18:36.291 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715574.642064, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715574.642064, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776715574.642064, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "295634514", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 16:06:46 -0400 (0:00:00.976) 0:18:37.267 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:497 Monday 20 April 2026 16:06:46 -0400 (0:00:00.271) 0:18:37.538 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 16:06:47 -0400 (0:00:00.451) 0:18:37.989 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:06:47 -0400 (0:00:00.001) 0:18:37.991 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:06:47 -0400 (0:00:00.140) 0:18:38.131 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:06:47 -0400 (0:00:00.213) 0:18:38.344 ********** included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:06:47 -0400 (0:00:00.106) 0:18:38.451 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:06:49 -0400 (0:00:01.719) 0:18:40.170 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:06:49 -0400 (0:00:00.700) 0:18:40.871 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: 
[managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:06:50 -0400 (0:00:00.250) 0:18:41.121 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:06:50 -0400 (0:00:00.065) 0:18:41.187 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:06:50 -0400 (0:00:00.088) 0:18:41.275 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:06:50 -0400 (0:00:00.053) 0:18:41.329 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:06:50 -0400 (0:00:00.110) 0:18:41.439 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:06:50 -0400 (0:00:00.303) 0:18:41.742 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:06:50 -0400 (0:00:00.087) 0:18:41.830 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:06:51 -0400 (0:00:00.170) 0:18:42.000 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": 
[] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:06:54 -0400 (0:00:03.057) 0:18:45.057 ********** ok: [managed-node16] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:06:54 -0400 (0:00:00.203) 0:18:45.260 ********** ok: [managed-node16] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:06:54 -0400 (0:00:00.087) 0:18:45.348 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:06:59 -0400 (0:00:04.984) 0:18:50.333 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 
16:06:59 -0400 (0:00:00.092) 0:18:50.426 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:06:59 -0400 (0:00:00.117) 0:18:50.543 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:06:59 -0400 (0:00:00.103) 0:18:50.647 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:06:59 -0400 (0:00:00.043) 0:18:50.690 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:07:02 -0400 (0:00:03.209) 0:18:53.899 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { 
"name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": 
"systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:07:05 -0400 (0:00:02.331) 0:18:56.230 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:07:05 -0400 (0:00:00.195) 0:18:56.425 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "state": "mounted" } ], "packages": [ 
"cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:07:18 -0400 (0:00:13.055) 0:19:09.481 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:07:18 -0400 (0:00:00.075) 0:19:09.556 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715522.7982872, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1776715522.7952871, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715522.7952871, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:07:19 -0400 (0:00:00.687) 0:19:10.243 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:07:20 -0400 (0:00:00.778) 0:19:11.022 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:07:20 -0400 (0:00:00.219) 0:19:11.242 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:07:20 -0400 (0:00:00.223) 0:19:11.465 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:07:20 -0400 (0:00:00.165) 0:19:11.631 ********** ok: [managed-node16] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:07:20 -0400 (0:00:00.116) 0:19:11.747 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:07:22 -0400 (0:00:01.208) 0:19:12.955 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:07:23 -0400 (0:00:01.196) 0:19:14.152 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", 
"backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:07:23 -0400 (0:00:00.544) 0:19:14.697 ********** skipping: [managed-node16] => (item={'src': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:07:23 -0400 (0:00:00.129) 0:19:14.826 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:07:25 -0400 (0:00:01.170) 0:19:15.997 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715533.9142394, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715526.5952709, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 165675208, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776715526.5932708, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1209341374", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:07:25 -0400 (0:00:00.843) 0:19:16.840 ********** changed: [managed-node16] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:07:26 -0400 (0:00:00.709) 
0:19:17.550 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:07:27 -0400 (0:00:01.235) 0:19:18.786 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:513 Monday 20 April 2026 16:07:28 -0400 (0:00:00.812) 0:19:19.598 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:07:28 -0400 (0:00:00.255) 0:19:19.854 ********** ok: [managed-node16] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:07:29 -0400 (0:00:00.109) 0:19:19.985 ********** skipping: [managed-node16] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:07:29 -0400 (0:00:00.097) 0:19:20.083 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "cd114bdc-ba0a-49ea-b802-bb63606fbf6d" }, "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "size": "4G", "type": "crypt", "uuid": "2f1d089b-ee81-4fe6-a4a9-3d102c821043" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": 
"250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:07:29 -0400 (0:00:00.729) 0:19:20.812 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003803", "end": "2026-04-20 16:07:31.745994", "rc": 0, "start": "2026-04-20 16:07:30.742191" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:07:31 -0400 (0:00:01.966) 0:19:22.779 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002525", "end": "2026-04-20 16:07:32.417417", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:07:32.414892" } STDOUT: luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:07:32 -0400 (0:00:00.649) 0:19:23.428 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for 
managed-node16 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 16:07:32 -0400 (0:00:00.163) 0:19:23.592 
********** ok: [managed-node16] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 16:07:32 -0400 (0:00:00.123) 0:19:23.715 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.027020", "end": "2026-04-20 16:07:33.485822", "rc": 0, "start": "2026-04-20 16:07:33.458802" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 16:07:33 -0400 (0:00:00.782) 0:19:24.498 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 16:07:33 -0400 (0:00:00.081) 0:19:24.579 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node16 => (item=members) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node16 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 16:07:33 -0400 (0:00:00.238) 0:19:24.817 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get 
the canonical device path for each member device] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 16:07:34 -0400 (0:00:00.181) 0:19:24.999 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 16:07:34 -0400 (0:00:00.862) 0:19:25.861 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 16:07:35 -0400 (0:00:00.123) 0:19:25.985 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 16:07:35 -0400 (0:00:00.135) 0:19:26.120 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 16:07:35 -0400 (0:00:00.194) 0:19:26.314 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 16:07:35 -0400 (0:00:00.131) 0:19:26.446 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 16:07:35 -0400 (0:00:00.195) 0:19:26.641 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 16:07:35 -0400 (0:00:00.195) 0:19:26.836 ********** ok: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 16:07:36 -0400 (0:00:00.155) 0:19:26.992 ********** ok: [managed-node16] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.41.179 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 16:07:37 -0400 (0:00:01.108) 0:19:28.101 ********** skipping: [managed-node16] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "grow_supported.rc == 0", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 16:07:37 -0400 (0:00:00.076) 0:19:28.177 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node16 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 16:07:37 -0400 (0:00:00.130) 0:19:28.308 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 16:07:37 -0400 (0:00:00.047) 0:19:28.356 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 16:07:37 -0400 (0:00:00.028) 0:19:28.384 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 16:07:37 -0400 (0:00:00.045) 0:19:28.430 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 16:07:37 -0400 (0:00:00.058) 0:19:28.489 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 16:07:37 -0400 (0:00:00.124) 0:19:28.613 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 16:07:37 -0400 (0:00:00.077) 0:19:28.691 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": 
"Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 16:07:37 -0400 (0:00:00.037) 0:19:28.728 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 16:07:37 -0400 (0:00:00.110) 0:19:28.839 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 16:07:37 -0400 (0:00:00.071) 0:19:28.910 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 16:07:38 -0400 (0:00:00.056) 0:19:28.967 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 16:07:38 -0400 (0:00:00.111) 0:19:29.078 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node16 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 16:07:38 -0400 (0:00:00.275) 0:19:29.354 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] 
******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 16:07:38 -0400 (0:00:00.189) 0:19:29.543 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 16:07:38 -0400 (0:00:00.125) 0:19:29.669 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 16:07:38 -0400 (0:00:00.094) 0:19:29.764 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 16:07:39 -0400 (0:00:00.201) 0:19:29.965 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 
16:07:39 -0400 (0:00:00.153) 0:19:30.118 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 16:07:39 -0400 (0:00:00.109) 0:19:30.228 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 16:07:39 -0400 (0:00:00.111) 0:19:30.339 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 16:07:39 -0400 (0:00:00.059) 0:19:30.399 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node16 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 16:07:39 -0400 (0:00:00.158) 0:19:30.558 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node16 => (item={'encryption': True, 
'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 16:07:39 -0400 (0:00:00.154) 0:19:30.713 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 16:07:39 -0400 (0:00:00.174) 0:19:30.887 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 16:07:40 -0400 (0:00:00.099) 0:19:30.987 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 16:07:40 -0400 (0:00:00.078) 0:19:31.066 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 16:07:40 -0400 (0:00:00.063) 0:19:31.129 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node16 TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 16:07:40 -0400 (0:00:00.272) 0:19:31.401 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 16:07:40 -0400 (0:00:00.117) 0:19:31.519 
********** skipping: [managed-node16] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 16:07:40 -0400 (0:00:00.146) 0:19:31.666 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node16 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 16:07:40 -0400 (0:00:00.183) 0:19:31.849 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 16:07:41 -0400 (0:00:00.164) 0:19:32.013 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 16:07:41 -0400 (0:00:00.215) 0:19:32.229 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": 
"Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 16:07:41 -0400 (0:00:00.137) 0:19:32.366 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 16:07:41 -0400 (0:00:00.119) 0:19:32.485 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 16:07:41 -0400 (0:00:00.042) 0:19:32.528 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 16:07:41 -0400 (0:00:00.065) 0:19:32.594 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 
16:07:41 -0400 (0:00:00.124) 0:19:32.719 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node16 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 16:07:41 -0400 (0:00:00.151) 0:19:32.870 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 16:07:42 -0400 (0:00:00.189) 0:19:33.059 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 16:07:42 -0400 (0:00:00.114) 0:19:33.173 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 16:07:42 -0400 (0:00:00.076) 0:19:33.250 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 16:07:42 -0400 (0:00:00.283) 0:19:33.533 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 16:07:43 -0400 (0:00:00.757) 0:19:34.291 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 16:07:43 -0400 (0:00:00.242) 0:19:34.533 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 16:07:43 -0400 (0:00:00.192) 0:19:34.725 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 16:07:43 -0400 (0:00:00.173) 0:19:34.898 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node16 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 16:07:44 -0400 (0:00:00.483) 0:19:35.381 ********** skipping: [managed-node16] => { "changed": false, 
"false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 16:07:44 -0400 (0:00:00.169) 0:19:35.551 ********** skipping: [managed-node16] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 16:07:44 -0400 (0:00:00.205) 0:19:35.757 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 16:07:44 -0400 (0:00:00.157) 0:19:35.915 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 16:07:45 -0400 (0:00:00.144) 0:19:36.060 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 16:07:45 -0400 
(0:00:00.322) 0:19:36.383 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 16:07:45 -0400 (0:00:00.153) 0:19:36.536 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 16:07:45 -0400 (0:00:00.146) 0:19:36.682 ********** ok: [managed-node16] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 16:07:45 -0400 (0:00:00.173) 0:19:36.856 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': 
[], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:07:46 -0400 (0:00:00.169) 0:19:37.026 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:07:46 -0400 (0:00:00.275) 0:19:37.302 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:07:47 -0400 (0:00:00.772) 0:19:38.074 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:07:47 -0400 (0:00:00.198) 0:19:38.273 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:07:47 -0400 (0:00:00.179) 0:19:38.452 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "(not 
storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:07:47 -0400 (0:00:00.204) 0:19:38.657 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:07:47 -0400 (0:00:00.171) 0:19:38.829 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:07:48 -0400 (0:00:00.146) 0:19:38.975 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:07:48 -0400 (0:00:00.135) 0:19:39.111 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was 
False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:07:48 -0400 (0:00:00.141) 0:19:39.252 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:07:48 -0400 (0:00:00.248) 0:19:39.501 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:07:48 -0400 (0:00:00.139) 0:19:39.641 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:07:48 -0400 (0:00:00.157) 0:19:39.798 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:07:49 -0400 (0:00:00.182) 0:19:39.981 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:07:49 -0400 (0:00:00.463) 0:19:40.445 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:07:49 -0400 (0:00:00.184) 0:19:40.629 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:07:49 -0400 (0:00:00.217) 0:19:40.847 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 
20 April 2026 16:07:50 -0400 (0:00:00.199) 0:19:41.047 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:07:50 -0400 (0:00:00.114) 0:19:41.162 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:07:50 -0400 (0:00:00.158) 0:19:41.321 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:07:50 -0400 (0:00:00.260) 0:19:41.581 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:07:50 -0400 (0:00:00.194) 0:19:41.776 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715638.2337902, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715638.2337902, "dev": 6, "device_type": 64768, 
"executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 310736, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715638.2337902, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:07:51 -0400 (0:00:01.114) 0:19:42.890 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:07:52 -0400 (0:00:00.232) 0:19:43.122 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:07:52 -0400 (0:00:00.175) 0:19:43.298 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:07:52 -0400 
(0:00:00.222) 0:19:43.521 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:07:52 -0400 (0:00:00.202) 0:19:43.723 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:07:52 -0400 (0:00:00.065) 0:19:43.789 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:07:53 -0400 (0:00:00.160) 0:19:43.949 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715638.3687897, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715638.3687897, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 329336, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715638.3687897, "nlink": 1, "path": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": 
false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026 16:07:54 -0400 (0:00:00.990)       0:19:44.939 **********
ok: [managed-node16] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026 16:07:56 -0400 (0:00:02.896)       0:19:47.836 **********
ok: [managed-node16] => {
    "changed": false,
    "cmd": [
        "cryptsetup",
        "luksDump",
        "/dev/mapper/foo-test1"
    ],
    "delta": "0:00:00.009649",
    "end": "2026-04-20 16:07:57.759812",
    "rc": 0,
    "start": "2026-04-20 16:07:57.750163"
}

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           cd114bdc-ba0a-49ea-b802-bb63606fbf6d
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     933875
        Threads:    2
        Salt:       50 59 5c 00 2a d1 a8 8c 9c de 00 07 c3 7c 72 ca
                    27 2a bd 34 3d 1b 53 cd 9d 86 78 3b ed e5 94 05
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120249
        Salt:       a7 21 b1 bd e0 e7 2d 5c 2d d6 53 d7 d7 db 47 b8
                    af 08 e1 76 e7 51 2e 39 c8 10 95 21 dc bc b8 c4
        Digest:     37 e8 47 5f 77 88 5a 71 e7 a3 89 f2 0d 68 92 79
                    19 7d 19 7e bb 38 27 a7 6c 8b f7 b4 41 bc b9 bd

TASK [Verify the presence/absence of the LUKS device node] *********************
task path:
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:07:57 -0400 (0:00:00.957) 0:19:48.793 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:07:58 -0400 (0:00:00.158) 0:19:48.951 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:07:58 -0400 (0:00:00.194) 0:19:49.145 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:07:58 -0400 (0:00:00.235) 0:19:49.381 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:07:58 -0400 (0:00:00.163) 0:19:49.544 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:07:58 -0400 (0:00:00.186) 0:19:49.731 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:07:58 -0400 (0:00:00.138) 0:19:49.869 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:07:59 -0400 (0:00:00.223) 0:19:50.092 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:07:59 -0400 (0:00:00.247) 0:19:50.340 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:07:59 -0400 (0:00:00.185) 0:19:50.525 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:07:59 -0400 (0:00:00.046) 0:19:50.572 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:07:59 -0400 (0:00:00.052) 0:19:50.624 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:07:59 -0400 (0:00:00.239) 0:19:50.864 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:07:59 -0400 (0:00:00.034) 0:19:50.898 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:08:00 -0400 (0:00:00.051) 0:19:50.950 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:08:00 -0400 (0:00:00.038) 0:19:50.988 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:08:00 -0400 (0:00:00.044) 0:19:51.033 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:08:00 -0400 (0:00:00.101) 0:19:51.134 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:08:00 -0400 (0:00:00.145) 0:19:51.280 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:08:00 -0400 (0:00:00.050) 0:19:51.330 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:08:00 -0400 (0:00:00.031) 0:19:51.362 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:08:00 -0400 (0:00:00.055) 0:19:51.417 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:08:00 -0400 (0:00:00.036) 0:19:51.453 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:08:00 -0400 (0:00:00.038) 0:19:51.492 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:08:01 -0400 (0:00:01.128) 0:19:52.621 ********** ok: [managed-node16] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:08:02 -0400 (0:00:00.776) 0:19:53.398 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:08:02 -0400 (0:00:00.133) 0:19:53.532 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:08:02 -0400 (0:00:00.096) 0:19:53.628 ********** ok: [managed-node16] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:08:03 -0400 (0:00:00.753) 0:19:54.382 ********** skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:08:03 -0400 (0:00:00.083) 0:19:54.466 ********** skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:08:03 -0400 (0:00:00.081) 0:19:54.547 ********** skipping: [managed-node16] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:08:03 -0400 (0:00:00.069) 0:19:54.616 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:08:03 -0400 (0:00:00.062) 0:19:54.679 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:08:03 -0400 (0:00:00.066) 0:19:54.746 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space 
size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:08:03 -0400 (0:00:00.079) 0:19:54.826 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:08:03 -0400 (0:00:00.091) 0:19:54.917 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:08:04 -0400 (0:00:00.261) 0:19:55.179 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:08:04 -0400 (0:00:00.129) 0:19:55.309 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:08:04 -0400 (0:00:00.085) 0:19:55.394 ********** skipping: [managed-node16] => { "changed": false, 
"false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:08:04 -0400 (0:00:00.122) 0:19:55.517 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:08:04 -0400 (0:00:00.091) 0:19:55.609 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:08:04 -0400 (0:00:00.097) 0:19:55.706 ********** skipping: [managed-node16] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:08:04 -0400 (0:00:00.174) 0:19:55.880 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:08:05 -0400 (0:00:00.129) 0:19:56.010 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:08:05 -0400 (0:00:00.090) 0:19:56.101 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:08:05 -0400 (0:00:00.125) 0:19:56.227 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:08:05 -0400 (0:00:00.058) 0:19:56.285 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:08:05 -0400 (0:00:00.219) 0:19:56.504 ********** ok: [managed-node16] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:08:05 -0400 (0:00:00.150) 0:19:56.655 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:08:05 -0400 (0:00:00.218) 0:19:56.874 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:08:06 -0400 (0:00:00.433) 0:19:57.308 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023473", "end": "2026-04-20 16:08:07.600880", "rc": 0, "start": "2026-04-20 16:08:07.577407" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:08:07 -0400 (0:00:01.395) 0:19:58.704 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:08:08 -0400 (0:00:00.240) 0:19:58.945 ********** ok: 
[managed-node16] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:08:08 -0400 (0:00:00.236) 0:19:59.181 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:08:08 -0400 (0:00:00.150) 0:19:59.332 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:08:08 -0400 (0:00:00.131) 0:19:59.463 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:08:08 -0400 (0:00:00.119) 0:19:59.583 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 
Monday 20 April 2026 16:08:08 -0400 (0:00:00.144) 0:19:59.727 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:08:08 -0400 (0:00:00.149) 0:19:59.876 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:08:09 -0400 (0:00:00.194) 0:20:00.071 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:516 Monday 20 April 2026 16:08:09 -0400 (0:00:00.152) 0:20:00.224 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node16 TASK [Clear facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Monday 20 April 2026 16:08:09 -0400 (0:00:00.439) 0:20:00.663 ********** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:08:09 -0400 (0:00:00.023) 0:20:00.687 ********** skipping: [managed-node16] 
=> { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:08:09 -0400 (0:00:00.205) 0:20:00.892 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:08:10 -0400 (0:00:00.312) 0:20:01.204 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:08:10 -0400 (0:00:00.124) 0:20:01.329 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:08:12 -0400 (0:00:01.724) 0:20:03.053 ********** ok: [managed-node16] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:08:13 -0400 (0:00:01.040) 0:20:04.094 ********** skipping: [managed-node16] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node16] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, 
"false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node16] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:08:13 -0400 (0:00:00.425) 0:20:04.519 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:08:13 -0400 (0:00:00.207) 0:20:04.726 ********** skipping: [managed-node16] => { "changed": 
false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:08:13 -0400 (0:00:00.166) 0:20:04.892 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:08:14 -0400 (0:00:00.144) 0:20:05.037 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:08:14 -0400 (0:00:00.200) 0:20:05.238 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:08:14 -0400 (0:00:00.405) 0:20:05.643 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:08:14 -0400 (0:00:00.138) 0:20:05.781 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:08:15 -0400 (0:00:00.222) 0:20:06.004 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:08:18 -0400 (0:00:03.439) 0:20:09.444 ********** ok: [managed-node16] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:08:18 -0400 (0:00:00.126) 0:20:09.570 ********** ok: [managed-node16] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:08:18 -0400 (0:00:00.181) 0:20:09.751 ********** ok: [managed-node16] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:08:23 -0400 (0:00:04.723) 0:20:14.475 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node16 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:08:23 -0400 (0:00:00.065) 0:20:14.540 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:08:23 -0400 (0:00:00.067) 0:20:14.608 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:08:23 -0400 (0:00:00.082) 0:20:14.690 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:08:23 -0400 (0:00:00.101) 0:20:14.792 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:08:27 -0400 (0:00:03.474) 0:20:18.267 ********** ok: [managed-node16] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "auto-cpufreq.service": { "name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { 
"name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-activation.service": { "name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "active" }, 
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "power-profiles-daemon.service": { "name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "snapd.seeded.service": { "name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": 
"systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:08:29 -0400 (0:00:02.610) 0:20:20.877 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:08:30 -0400 (0:00:00.288) 0:20:21.166 ********** changed: [managed-node16] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", 
"fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:08:36 -0400 (0:00:05.954) 0:20:27.120 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:08:36 -0400 (0:00:00.079) 0:20:27.200 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715643.6477668, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "fea6f2c2492f1af4be8839f261d06092fc5f0052", "ctime": 1776715643.644767, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 423624838, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715643.644767, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2701322441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:08:37 -0400 (0:00:00.861) 0:20:28.061 ********** ok: [managed-node16] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:08:37 -0400 (0:00:00.824) 0:20:28.886 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:08:38 -0400 (0:00:00.264) 0:20:29.151 ********** ok: [managed-node16] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": 
"foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:08:38 -0400 (0:00:00.150) 0:20:29.301 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:08:38 -0400 (0:00:00.074) 0:20:29.375 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": 
null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:08:38 -0400 (0:00:00.145) 0:20:29.521 ********** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node16] => (item={'src': '/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:08:39 -0400 (0:00:00.650) 0:20:30.172 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:08:40 -0400 (0:00:01.046) 0:20:31.218 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 
16:08:40 -0400 (0:00:00.083) 0:20:31.301 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:08:40 -0400 (0:00:00.074) 0:20:31.376 ********** ok: [managed-node16] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:08:41 -0400 (0:00:01.129) 0:20:32.506 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715652.4157293, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3a33be1bce92849269ec96b879d6d0965f5d1559", "ctime": 1776715646.5217545, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 341835908, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715646.5207546, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1343602188", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:08:42 -0400 (0:00:01.235) 0:20:33.741 ********** changed: [managed-node16] => 
(item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-cd114bdc-ba0a-49ea-b802-bb63606fbf6d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:08:44 -0400 (0:00:01.212) 0:20:34.953 ********** ok: [managed-node16] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:08:45 -0400 (0:00:01.708) 0:20:36.661 ********** ok: [managed-node16] => { "changed": false } TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:525 Monday 20 April 2026 16:08:46 -0400 (0:00:01.151) 0:20:37.813 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node16 TASK [Print out pool information] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:08:47 -0400 (0:00:00.412) 0:20:38.226 ********** skipping: [managed-node16] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:08:47 -0400 (0:00:00.173) 
0:20:38.399 ********** ok: [managed-node16] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:08:47 -0400 (0:00:00.232) 0:20:38.631 ********** ok: [managed-node16] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:08:48 -0400 (0:00:00.923) 0:20:39.555 ********** ok: [managed-node16] 
=> { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003618", "end": "2026-04-20 16:08:50.537808", "rc": 0, "start": "2026-04-20 16:08:49.534190" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:08:50 -0400 (0:00:02.006) 0:20:41.562 ********** ok: [managed-node16] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002717", "end": "2026-04-20 16:08:51.535923", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:08:51.533206" } TASK [Verify the 
volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:08:51 -0400 (0:00:00.980) 0:20:42.542 ********** skipping: [managed-node16] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:08:51 -0400 (0:00:00.104) 0:20:42.647 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node16 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'lvmpv', 'mount_options': 'defaults', 'mount_point': None, 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=tsqvh6-zQnY-J0yu-7Snb-q1YD-mcod-Qfk8qg'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:08:51 -0400 
(0:00:00.157) 0:20:42.805 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:08:52 -0400 (0:00:00.205) 0:20:43.010 ********** included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node16 => (item=mount) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node16 => (item=fstab) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node16 => (item=fs) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node16 => (item=device) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node16 => (item=encryption) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node16 => (item=md) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node16 => (item=size) included: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node16 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:08:52 -0400 
(0:00:00.585) 0:20:43.596 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:08:52 -0400 (0:00:00.076) 0:20:43.673 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:08:52 -0400 (0:00:00.134) 0:20:43.808 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:08:52 -0400 (0:00:00.087) 0:20:43.895 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:08:53 -0400 (0:00:00.066) 0:20:43.962 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:08:53 -0400 (0:00:00.046) 0:20:44.009 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:08:53 -0400 (0:00:00.056) 0:20:44.065 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:08:53 -0400 (0:00:00.040) 0:20:44.105 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:08:53 -0400 (0:00:00.102) 0:20:44.208 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:08:53 -0400 (0:00:00.079) 0:20:44.287 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:08:53 -0400 (0:00:00.063) 0:20:44.351 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:08:53 -0400 (0:00:00.176) 0:20:44.527 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:08:53 -0400 (0:00:00.362) 0:20:44.889 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:08:54 -0400 (0:00:00.086) 0:20:44.976 ********** ok: [managed-node16] => { 
"changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:08:54 -0400 (0:00:00.090) 0:20:45.066 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:08:54 -0400 (0:00:00.091) 0:20:45.157 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:08:54 -0400 (0:00:00.120) 0:20:45.278 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:08:54 -0400 (0:00:00.173) 0:20:45.451 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:08:54 -0400 (0:00:00.109) 0:20:45.561 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:08:54 -0400 (0:00:00.104) 0:20:45.666 ********** ok: [managed-node16] => { "changed": false, "stat": { "atime": 1776715716.0134554, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715716.0134554, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37410, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776715716.0134554, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:08:55 -0400 (0:00:01.181) 0:20:46.847 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 
16:08:56 -0400 (0:00:00.294) 0:20:47.141 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:08:56 -0400 (0:00:00.144) 0:20:47.286 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:08:56 -0400 (0:00:00.154) 0:20:47.440 ********** ok: [managed-node16] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:08:56 -0400 (0:00:00.123) 0:20:47.564 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:08:56 -0400 (0:00:00.116) 0:20:47.681 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:08:56 -0400 (0:00:00.075) 0:20:47.756 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:08:56 -0400 (0:00:00.155) 0:20:47.911 ********** ok: [managed-node16] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:09:00 -0400 (0:00:03.181) 0:20:51.093 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:09:00 -0400 (0:00:00.102) 0:20:51.195 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:09:00 -0400 (0:00:00.137) 0:20:51.333 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", 
"skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:09:00 -0400 (0:00:00.194) 0:20:51.527 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:09:00 -0400 (0:00:00.142) 0:20:51.669 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:09:00 -0400 (0:00:00.138) 0:20:51.808 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:09:00 -0400 (0:00:00.108) 0:20:51.917 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:09:01 -0400 (0:00:00.090) 0:20:52.008 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:09:01 -0400 (0:00:00.063) 0:20:52.071 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:09:01 -0400 (0:00:00.211) 0:20:52.282 ********** ok: [managed-node16] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:09:01 -0400 (0:00:00.235) 0:20:52.518 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:09:01 -0400 (0:00:00.207) 0:20:52.725 ********** skipping: [managed-node16] => { "changed": false, "false_condition": 
"_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:09:01 -0400 (0:00:00.189) 0:20:52.915 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:09:02 -0400 (0:00:00.201) 0:20:53.117 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:09:02 -0400 (0:00:00.175) 0:20:53.292 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:09:02 -0400 (0:00:00.157) 0:20:53.449 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:09:02 -0400 (0:00:00.192) 0:20:53.642 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:09:02 -0400 (0:00:00.168) 0:20:53.811 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:09:03 -0400 (0:00:00.118) 0:20:53.929 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:09:03 -0400 (0:00:00.095) 0:20:54.025 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:09:03 -0400 (0:00:00.192) 0:20:54.217 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type 
== 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:09:03 -0400 (0:00:00.131) 0:20:54.349 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:09:03 -0400 (0:00:00.097) 0:20:54.447 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:09:03 -0400 (0:00:00.092) 0:20:54.540 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:09:04 -0400 (0:00:00.728) 0:20:55.269 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 
April 2026 16:09:04 -0400 (0:00:00.087) 0:20:55.357 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:09:04 -0400 (0:00:00.093) 0:20:55.450 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:09:04 -0400 (0:00:00.051) 0:20:55.502 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:09:04 -0400 (0:00:00.069) 0:20:55.571 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:09:04 -0400 (0:00:00.038) 0:20:55.610 ********** skipping: [managed-node16] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:09:04 -0400 (0:00:00.022) 0:20:55.632 ********** skipping: [managed-node16] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:09:04 -0400 (0:00:00.046) 0:20:55.678 ********** skipping: [managed-node16] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:09:04 -0400 (0:00:00.072) 0:20:55.750 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:09:04 -0400 (0:00:00.041) 0:20:55.792 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:09:04 -0400 (0:00:00.094) 0:20:55.887 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] 
*************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:09:05 -0400 (0:00:00.054) 0:20:55.941 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:09:05 -0400 (0:00:00.216) 0:20:56.157 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:09:05 -0400 (0:00:00.103) 0:20:56.261 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:09:05 -0400 (0:00:00.115) 0:20:56.376 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:09:05 -0400 (0:00:00.086) 0:20:56.462 ********** skipping: [managed-node16] => { 
"changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:09:05 -0400 (0:00:00.106) 0:20:56.569 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:09:05 -0400 (0:00:00.144) 0:20:56.713 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:09:05 -0400 (0:00:00.067) 0:20:56.781 ********** skipping: [managed-node16] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:09:05 -0400 (0:00:00.093) 0:20:56.875 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:09:06 -0400 (0:00:00.131) 0:20:57.007 ********** skipping: [managed-node16] => { "changed": false, 
"false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:09:06 -0400 (0:00:00.082) 0:20:57.089 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:09:06 -0400 (0:00:00.115) 0:20:57.204 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:09:06 -0400 (0:00:00.097) 0:20:57.301 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:09:06 -0400 (0:00:00.196) 0:20:57.498 ********** ok: [managed-node16] => { "storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:09:06 -0400 (0:00:00.085) 0:20:57.584 ********** ok: [managed-node16] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:09:06 -0400 (0:00:00.078) 0:20:57.663 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:09:06 -0400 (0:00:00.179) 0:20:57.842 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:09:07 -0400 (0:00:00.144) 0:20:57.987 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:09:07 -0400 (0:00:00.122) 0:20:58.109 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 
'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:09:07 -0400 (0:00:00.056) 0:20:58.166 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:09:07 -0400 (0:00:00.147) 0:20:58.313 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:09:07 -0400 (0:00:00.122) 0:20:58.435 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:09:07 -0400 (0:00:00.143) 0:20:58.579 ********** skipping: [managed-node16] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] 
********************************************************** task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:09:07 -0400 (0:00:00.097) 0:20:58.677 ********** ok: [managed-node16] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:09:07 -0400 (0:00:00.135) 0:20:58.813 ********** ok: [managed-node16] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } PLAY RECAP ********************************************************************* managed-node16 : ok=1283 changed=60 unreachable=0 failed=0 skipped=1110 rescued=18 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:49:23.401038+00:00Z", "host": "managed-node16", "message": "encrypted volume 'foo' missing key/password", "start_time": "2026-04-20T19:49:18.401305+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:49:23.623173+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": 
"encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:49:23.407821+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:51:20.686695+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'luks-72e0c453-925f-4c37-ac54-222606f3f95e' in safe mode due to encryption removal", "start_time": "2026-04-20T19:51:15.654122+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:51:20.789851+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-72e0c453-925f-4c37-ac54-222606f3f95e' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:51:20.693681+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:52:56.246631+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding 
encryption", "start_time": "2026-04-20T19:52:50.808780+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:52:56.324598+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:52:56.253990+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:54:43.983557+00:00Z", "host": "managed-node16", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-20T19:54:39.056278+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:54:44.137535+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": 
{ "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:54:43.990716+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:56:42.627483+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'luks-228a619c-db48-4cff-ab7e-39c8c44696a2' in safe mode due to encryption removal", "start_time": "2026-04-20T19:56:37.603213+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:56:42.759553+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-228a619c-db48-4cff-ab7e-39c8c44696a2' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:56:42.635650+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:58:45.324366+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-04-20T19:58:40.753484+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T19:58:45.374924+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": 
[], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:58:45.333776+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T20:00:50.334907+00:00Z", "host": "managed-node16", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-20T20:00:45.173549+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T20:00:50.639902+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": 
null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T20:00:50.390925+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T20:04:42.739036+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'luks-06f177c4-ae08-46dd-9417-c35fb55a6429' in safe mode due to encryption removal", "start_time": "2026-04-20T20:04:37.150759+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T20:04:42.862224+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-06f177c4-ae08-46dd-9417-c35fb55a6429' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T20:04:42.762620+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T20:06:42.134877+00:00Z", "host": "managed-node16", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-04-20T20:06:37.353310+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.16.18", "end_time": "2026-04-20T20:06:42.388358+00:00Z", "host": "managed-node16", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T20:06:42.184819+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Monday 20 April 2026 16:09:07 -0400 (0:00:00.110) 0:20:58.923 ********** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.66s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.07s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.79s 
/tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.34s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.06s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.72s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Find unused disks in the system ----------------------------------------- 7.19s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.07s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.95s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.75s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.60s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.55s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required 
packages --------------- 5.45s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.45s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.41s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Make sure blivet is available ------- 5.39s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 fedora.linux_system_roles.storage : Get required packages --------------- 5.38s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.25s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.24s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.22s /tmp/collections-ysS/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
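
The two recurring failures in the error dump above map directly to inputs visible in the captured `module_args`: `"safe_mode": true` with a format change pending produces "cannot remove existing formatting ... in safe mode", and `"encryption": true` with `"encryption_password": null` produces "encrypted volume 'test1' missing key/password". As a hedged sketch (not part of this log), this is how a caller of the `fedora.linux_system_roles.storage` role would typically express the intended state; the variable names follow the role's documented interface, and `vault_luks_password` is a hypothetical placeholder for a vaulted secret:

```yaml
# Sketch only, mirroring the pool layout from the module_args captured above.
- name: Encrypt an existing volume, permitting destructive reformat
  vars:
    # Safe mode defaults to true and refuses to remove existing formatting;
    # the tests above toggle it deliberately to exercise both code paths.
    storage_safe_mode: false
    storage_pools:
      - name: foo
        disks: ["sda"]
        type: lvm
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            fs_type: xfs
            encryption: true
            encryption_luks_version: luks2
            # Omitting this while encryption is true yields
            # "encrypted volume 'test1' missing key/password".
            encryption_password: "{{ vault_luks_password }}"
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
```

Note that the test run intentionally triggers these failures (each "Manage the pools and volumes..." task at main-blivet.yml:88 is followed by a "Failed message" assertion at line 129), so the errors above represent expected negative-path coverage rather than a broken run.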