ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
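The run that follows exercises the fedora.linux_system_roles.storage role from the tests_luks.yml test playbook against a LUKS-encrypted volume. The first scenario, "Test for correct handling of new encrypted volume w/ no key", amounts to something like the sketch below. The disk (sda), volume name (foo), mount point (/opt/test1), and safe-mode value are taken from the task output later in this log; the play skeleton and the use of include_role are assumptions for illustration, not a copy of the actual test file.

- hosts: all
  vars:
    storage_safe_mode: true       # mirrors storage_safe_mode_global: true shown later in the log
  tasks:
    - name: Create a LUKS-encrypted volume without supplying a key (expected to fail)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:          # values mirror the storage_volumes dump later in the log
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: true      # no key/passphrase (e.g. encryption_password) is given,
                                  # so the role is expected to raise an error

The verify-role-failed.yml include that wraps this invocation in the log is what asserts that the role fails with the expected error instead of proceeding; its exact assertions are not reproduced here.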
PLAYBOOK: tests_luks.yml ******************************************************* 1 plays in /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml PLAY [Test LUKS] *************************************************************** TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2 Tuesday 13 May 2025 10:46:43 -0400 (0:00:00.246) 0:00:00.246 *********** ok: [managed-node8] META: ran handlers TASK [Enable FIPS mode] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20 Tuesday 13 May 2025 10:46:47 -0400 (0:00:03.852) 0:00:04.098 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28 Tuesday 13 May 2025 10:46:48 -0400 (0:00:00.425) 0:00:04.524 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Enable FIPS mode] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39 Tuesday 13 May 2025 10:46:48 -0400 (0:00:00.351) 0:00:04.876 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43 Tuesday 13 May 2025 10:46:48 -0400 (0:00:00.418) 0:00:05.294 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure dracut-fips] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53 Tuesday 13 May 2025 10:46:49 -0400 (0:00:00.358) 0:00:05.652 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Configure boot for FIPS] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59 Tuesday 13 May 2025 10:46:49 -0400 (0:00:00.296) 0:00:05.948 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68 Tuesday 13 May 2025 10:46:49 -0400 (0:00:00.319) 0:00:06.268 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72 Tuesday 13 May 2025 10:46:50 -0400 (0:00:00.225) 0:00:06.494 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:46:50 -0400 (0:00:00.225) 0:00:06.720 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:46:50 -0400 (0:00:00.229) 0:00:06.949 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:46:50 -0400 (0:00:00.414) 0:00:07.363 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:46:51 -0400 (0:00:00.628) 0:00:07.991 *********** ok: [managed-node8] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:46:53 -0400 (0:00:02.151) 0:00:10.143 *********** ok: [managed-node8] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:46:54 -0400 (0:00:00.406) 0:00:10.549 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes 
to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:46:54 -0400 (0:00:00.090) 0:00:10.639 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:46:54 -0400 (0:00:00.162) 0:00:10.802 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:46:55 -0400 (0:00:00.767) 0:00:11.569 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:47:01 -0400 (0:00:06.373) 0:00:17.943 *********** ok: [managed-node8] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:47:01 -0400 (0:00:00.403) 0:00:18.347 *********** ok: [managed-node8] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:47:02 -0400 (0:00:00.514) 0:00:18.861 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:47:06 -0400 (0:00:03.586) 0:00:22.448 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:47:06 -0400 (0:00:00.751) 0:00:23.200 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:47:06 -0400 (0:00:00.115) 0:00:23.315 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:47:07 -0400 (0:00:00.172) 
0:00:23.487 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:47:07 -0400 (0:00:00.154) 0:00:23.641 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:47:11 -0400 (0:00:04.526) 0:00:28.168 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": 
{ "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": 
"microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": 
"restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": 
"systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": 
"tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:47:15 -0400 (0:00:03.776) 0:00:31.945 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:47:16 -0400 (0:00:00.786) 0:00:32.732 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:47:16 -0400 (0:00:00.224) 0:00:32.956 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 10:47:18 -0400 (0:00:01.930) 0:00:34.886 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 10:47:18 -0400 (0:00:00.310) 0:00:35.196 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747146747.7113545, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1747146730.2753284, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"text/plain", "mode": "0644", "mtime": 1747146730.2743285, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 10:47:20 -0400 (0:00:01.273) 0:00:36.470 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:47:20 -0400 (0:00:00.215) 0:00:36.686 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 10:47:20 -0400 (0:00:00.215) 0:00:36.902 *********** ok: [managed-node8] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 10:47:20 -0400 (0:00:00.243) 0:00:37.146 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 10:47:20 -0400 (0:00:00.253) 0:00:37.399 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 10:47:21 -0400 (0:00:00.163) 0:00:37.563 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 10:47:21 -0400 (0:00:00.126) 0:00:37.690 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 10:47:21 -0400 (0:00:00.296) 0:00:37.986 *********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 10:47:21 -0400 (0:00:00.252) 0:00:38.238 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 10:47:21 -0400 (0:00:00.176) 0:00:38.414 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 10:47:22 -0400 (0:00:00.178) 0:00:38.593 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747146524.672022, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 10:47:23 -0400 (0:00:01.472) 0:00:40.065 *********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 10:47:23 -0400 (0:00:00.210) 0:00:40.276 *********** ok: [managed-node8] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:76 Tuesday 13 May 2025 10:47:25 -0400 (0:00:01.930) 0:00:42.207 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node8 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Tuesday 13 May 2025 10:47:26 -0400 (0:00:00.394) 0:00:42.601 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Tuesday 13 May 2025 10:47:30 -0400 (0:00:04.596) 0:00:47.198 *********** ok: [managed-node8] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: 
NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Tuesday 13 May 2025 10:47:32 -0400 (0:00:02.021) 0:00:49.220 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Tuesday 13 May 2025 10:47:32 -0400 (0:00:00.189) 0:00:49.409 *********** ok: [managed-node8] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Tuesday 13 May 2025 10:47:33 -0400 (0:00:00.255) 0:00:49.665 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Tuesday 13 May 2025 10:47:33 -0400 (0:00:00.206) 0:00:49.872 *********** ok: [managed-node8] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:85 Tuesday 13 May 2025 10:47:33 -0400 (0:00:00.258) 0:00:50.130 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 10:47:33 -0400 (0:00:00.228) 0:00:50.359 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 10:47:34 -0400 (0:00:00.232) 0:00:50.591 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:47:34 
-0400 (0:00:00.266) 0:00:50.858 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:47:34 -0400 (0:00:00.328) 0:00:51.187 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:47:35 -0400 (0:00:00.439) 0:00:51.627 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:47:35 -0400 (0:00:00.333) 0:00:51.961 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:47:35 -0400 (0:00:00.151) 0:00:52.113 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:47:35 -0400 (0:00:00.125) 0:00:52.238 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:47:35 -0400 (0:00:00.120) 0:00:52.359 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:47:36 -0400 (0:00:00.136) 0:00:52.496 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:47:36 -0400 (0:00:00.352) 0:00:52.848 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:47:40 -0400 (0:00:03.917) 0:00:56.765 *********** ok: [managed-node8] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:47:40 -0400 (0:00:00.163) 0:00:56.929 *********** ok: [managed-node8] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:47:40 -0400 (0:00:00.205) 0:00:57.135 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:47:46 -0400 (0:00:05.351) 0:01:02.486 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:47:46 -0400 (0:00:00.416) 0:01:02.902 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:47:46 -0400 (0:00:00.198) 0:01:03.101 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:47:46 -0400 (0:00:00.159) 0:01:03.260 *********** TASK 
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:47:47 -0400 (0:00:00.186) 0:01:03.447 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:47:51 -0400 (0:00:04.432) 0:01:07.880 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": 
"dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:47:54 -0400 (0:00:02.815) 0:01:10.696 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:47:54 -0400 (0:00:00.277) 0:01:10.973 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:47:54 -0400 (0:00:00.212) 0:01:11.185 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 10:47:59 -0400 (0:00:05.072) 0:01:16.257 *********** fatal: [managed-node8]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:48:00 -0400 (0:00:00.249) 0:01:16.507 *********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 10:48:00 -0400 (0:00:00.263) 0:01:16.771 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 10:48:00 -0400 (0:00:00.213) 0:01:16.985 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify 
correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 10:48:00 -0400 (0:00:00.382) 0:01:17.367 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:100 Tuesday 13 May 2025 10:48:01 -0400 (0:00:00.223) 0:01:17.591 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:48:01 -0400 (0:00:00.525) 0:01:18.116 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:48:02 -0400 (0:00:00.548) 0:01:18.665 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:48:02 -0400 (0:00:00.292) 0:01:18.957 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:48:03 -0400 (0:00:00.563) 0:01:19.521 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is 
ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:48:03 -0400 (0:00:00.198) 0:01:19.720 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:48:03 -0400 (0:00:00.220) 0:01:19.940 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:48:03 -0400 (0:00:00.198) 0:01:20.139 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:48:03 -0400 (0:00:00.215) 0:01:20.354 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:48:04 -0400 (0:00:00.605) 0:01:20.960 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:48:09 -0400 (0:00:05.148) 0:01:26.109 *********** ok: [managed-node8] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:48:09 -0400 (0:00:00.239) 0:01:26.349 *********** ok: [managed-node8] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:48:10 -0400 (0:00:00.297) 0:01:26.646 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:48:15 -0400 (0:00:04.996) 0:01:31.642 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages 
should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:48:15 -0400 (0:00:00.483) 0:01:32.126 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:48:15 -0400 (0:00:00.186) 0:01:32.313 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:48:16 -0400 (0:00:00.270) 0:01:32.583 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:48:16 -0400 (0:00:00.318) 0:01:32.902 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:48:20 -0400 (0:00:04.495) 0:01:37.398 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": 
"mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:48:23 -0400 (0:00:02.838) 0:01:40.236 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:48:24 -0400 (0:00:00.328) 0:01:40.565 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:48:24 -0400 (0:00:00.176) 0:01:40.742 *********** changed: [managed-node8] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "password": "-", "state": "present" } ], 
"leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 10:48:37 -0400 (0:00:13.288) 0:01:54.030 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 10:48:37 -0400 (0:00:00.224) 0:01:54.254 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747146747.7113545, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1747146730.2753284, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747146730.2743285, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 10:48:39 -0400 (0:00:01.814) 0:01:56.068 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd 
cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:48:42 -0400 (0:00:02.521) 0:01:58.590 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 10:48:42 -0400 (0:00:00.163) 0:01:58.754 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 10:48:42 -0400 (0:00:00.330) 0:01:59.084 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 10:48:42 -0400 (0:00:00.261) 0:01:59.346 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 10:48:43 -0400 (0:00:00.332) 0:01:59.678 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 10:48:43 -0400 (0:00:00.151) 0:01:59.829 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 10:48:47 -0400 (0:00:04.054) 0:02:03.884 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 10:48:50 -0400 (0:00:02.885) 0:02:06.769 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 10:48:50 -0400 (0:00:00.251) 0:02:07.021 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 10:48:52 -0400 (0:00:01.545) 0:02:08.566 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747146524.672022, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 10:48:53 -0400 (0:00:01.404) 0:02:09.971 *********** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-b96b1aea-97c1-4d41-bb07-48c1b0231627', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 10:48:55 -0400 (0:00:01.688) 0:02:11.659 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:112 Tuesday 13 May 2025 10:48:57 -0400 (0:00:02.012) 0:02:13.672 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 10:48:57 -0400 (0:00:00.575) 0:02:14.248 *********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 10:48:58 -0400 (0:00:00.324) 0:02:14.572 *********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 10:48:58 -0400 (0:00:00.233) 0:02:14.805 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "size": "10G", "type": "crypt", "uuid": "e714b002-a942-47cd-bed8-00a7c2045ef0" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "b96b1aea-97c1-4d41-bb07-48c1b0231627" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 10:49:01 -0400 (0:00:02.674) 0:02:17.480 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002619", "end": "2025-05-13 
10:49:03.899405", "rc": 0, "start": "2025-05-13 10:49:03.896786" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 10:49:04 -0400 (0:00:03.078) 0:02:20.559 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002854", "end": "2025-05-13 10:49:05.263575", "failed_when_result": false, "rc": 0, "start": "2025-05-13 10:49:05.260721" } STDOUT: luks-b96b1aea-97c1-4d41-bb07-48c1b0231627 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 10:49:05 -0400 (0:00:01.342) 0:02:21.901 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 10:49:05 -0400 (0:00:00.151) 0:02:22.053 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 10:49:06 -0400 (0:00:00.387) 0:02:22.440 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 10:49:06 -0400 (0:00:00.266) 0:02:22.706 *********** included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 10:49:07 -0400 (0:00:01.408) 0:02:24.114 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 10:49:08 -0400 (0:00:00.359) 0:02:24.474 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 10:49:08 -0400 (0:00:00.300) 0:02:24.775 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 10:49:08 -0400 (0:00:00.231) 0:02:25.006 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 10:49:08 -0400 (0:00:00.333) 0:02:25.340 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 10:49:09 -0400 (0:00:00.270) 0:02:25.610 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 10:49:09 -0400 (0:00:00.294) 0:02:25.905 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 10:49:09 -0400 (0:00:00.276) 0:02:26.181 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 10:49:09 -0400 (0:00:00.215) 0:02:26.396 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 10:49:10 -0400 (0:00:00.163) 0:02:26.560 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 10:49:10 -0400 (0:00:00.164) 0:02:26.724 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 10:49:10 -0400 (0:00:00.115) 0:02:26.840 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 10:49:10 -0400 (0:00:00.456) 0:02:27.296 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 10:49:11 -0400 (0:00:00.190) 0:02:27.486 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 10:49:11 -0400 (0:00:00.218) 0:02:27.705 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 10:49:11 -0400 (0:00:00.194) 0:02:27.899 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 10:49:11 -0400 (0:00:00.215) 0:02:28.115 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 10:49:11 -0400 (0:00:00.127) 0:02:28.243 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 10:49:12 -0400 (0:00:00.232) 0:02:28.475 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 10:49:12 -0400 (0:00:00.261) 0:02:28.737 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147717.1797948, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747147717.1797948, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36559, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1747147717.1797948, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 10:49:13 -0400 (0:00:01.157) 0:02:29.894 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 10:49:13 -0400 (0:00:00.279) 0:02:30.173 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 10:49:13 -0400 (0:00:00.246) 0:02:30.420 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 10:49:14 -0400 (0:00:00.253) 0:02:30.673 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 10:49:14 -0400 (0:00:00.179) 0:02:30.853 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 10:49:14 -0400 (0:00:00.235) 0:02:31.088 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 10:49:14 -0400 (0:00:00.207) 0:02:31.296 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147717.315795, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747147717.315795, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 130167, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747147717.315795, "nlink": 1, "path": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 10:49:15 -0400 (0:00:01.045) 0:02:32.341 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 10:49:20 -0400 (0:00:04.489) 0:02:36.831 *********** ok: [managed-node8] => { "changed": false, "cmd": [ 
"cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011966", "end": "2025-05-13 10:49:21.457965", "rc": 0, "start": "2025-05-13 10:49:21.445999" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: b96b1aea-97c1-4d41-bb07-48c1b0231627 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 919618 Threads: 2 Salt: 9e 27 27 2f 4c ca c1 a5 9b 21 4d 97 0f 87 fa 09 12 62 b0 27 c1 38 28 52 de 3e e7 1e da 0d 07 45 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 2e c5 84 68 fe 64 85 5e 47 90 49 cf 8a d1 4c 02 49 22 a9 34 99 74 15 b7 72 83 ab 08 36 f6 e9 5b Digest: 1d c6 9a 30 e3 e2 fc d1 b0 8b 74 cf cd 4e 4d a6 3a 7b 7b a2 ec 27 23 ea 77 08 d5 1c 7d 65 b0 08 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 10:49:21 -0400 (0:00:01.312) 0:02:38.144 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 10:49:21 -0400 (0:00:00.137) 0:02:38.281 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 10:49:22 -0400 (0:00:00.319) 0:02:38.601 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 10:49:22 -0400 (0:00:00.452) 0:02:39.054 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 10:49:22 -0400 (0:00:00.273) 0:02:39.328 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 10:49:23 -0400 (0:00:00.195) 0:02:39.524 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 10:49:23 -0400 (0:00:00.273) 0:02:39.798 
*********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 10:49:23 -0400 (0:00:00.261) 0:02:40.059 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-b96b1aea-97c1-4d41-bb07-48c1b0231627 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 10:49:23 -0400 (0:00:00.238) 0:02:40.297 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 10:49:24 -0400 (0:00:00.195) 0:02:40.493 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 10:49:24 -0400 (0:00:00.175) 0:02:40.669 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 10:49:24 -0400 (0:00:00.304) 0:02:40.973 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 10:49:24 -0400 (0:00:00.263) 0:02:41.236 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 10:49:25 -0400 (0:00:00.247) 0:02:41.484 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 10:49:25 -0400 (0:00:00.327) 0:02:41.811 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 10:49:25 -0400 (0:00:00.264) 
0:02:42.075 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 10:49:25 -0400 (0:00:00.202) 0:02:42.278 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 10:49:26 -0400 (0:00:00.296) 0:02:42.575 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 10:49:26 -0400 (0:00:00.278) 0:02:42.854 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 10:49:26 -0400 (0:00:00.163) 0:02:43.017 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 10:49:26 -0400 (0:00:00.159) 0:02:43.177 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 10:49:26 -0400 (0:00:00.232) 0:02:43.409 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 10:49:27 -0400 (0:00:00.177) 0:02:43.587 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 10:49:27 -0400 (0:00:00.244) 0:02:43.831 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 10:49:27 -0400 (0:00:00.191) 0:02:44.023 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** 
task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 10:49:27 -0400 (0:00:00.120) 0:02:44.144 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 10:49:27 -0400 (0:00:00.259) 0:02:44.403 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 10:49:28 -0400 (0:00:00.185) 0:02:44.589 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 10:49:28 -0400 (0:00:00.265) 0:02:44.855 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 10:49:28 -0400 (0:00:00.216) 0:02:45.071 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 10:49:28 -0400 (0:00:00.254) 0:02:45.326 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 10:49:29 -0400 (0:00:00.265) 0:02:45.592 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 10:49:29 -0400 (0:00:00.251) 0:02:45.844 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 10:49:29 -0400 (0:00:00.252) 0:02:46.097 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 10:49:29 -0400 (0:00:00.165) 0:02:46.262 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] 
***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 10:49:30 -0400 (0:00:00.199) 0:02:46.461 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 10:49:30 -0400 (0:00:00.174) 0:02:46.635 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 10:49:30 -0400 (0:00:00.126) 0:02:46.762 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 10:49:30 -0400 (0:00:00.134) 0:02:46.896 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 10:49:30 -0400 (0:00:00.189) 0:02:47.085 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 10:49:30 -0400 (0:00:00.148) 0:02:47.234 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 10:49:30 -0400 (0:00:00.165) 0:02:47.400 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 10:49:31 -0400 (0:00:00.157) 0:02:47.558 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 10:49:31 -0400 (0:00:00.126) 0:02:47.684 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 10:49:31 -0400 (0:00:00.295) 0:02:47.979 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size 
based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 10:49:31 -0400 (0:00:00.252) 0:02:48.232 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 10:49:32 -0400 (0:00:00.311) 0:02:48.543 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 10:49:32 -0400 (0:00:00.339) 0:02:48.883 *********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 10:49:32 -0400 (0:00:00.272) 0:02:49.155 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 10:49:33 -0400 (0:00:00.284) 0:02:49.439 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 10:49:33 -0400 (0:00:00.259) 0:02:49.699 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 10:49:33 -0400 (0:00:00.281) 0:02:49.981 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 10:49:33 -0400 (0:00:00.209) 0:02:50.190 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 10:49:33 -0400 (0:00:00.239) 0:02:50.430 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 10:49:34 -0400 (0:00:00.351) 0:02:50.781 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 10:49:34 -0400 (0:00:00.269) 0:02:51.051 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 10:49:34 -0400 (0:00:00.255) 0:02:51.307 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 10:49:35 -0400 (0:00:00.275) 0:02:51.582 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 10:49:35 -0400 (0:00:00.267) 0:02:51.849 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 13 May 2025 10:49:35 -0400 (0:00:00.388) 0:02:52.237 *********** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:118 Tuesday 13 May 2025 10:49:37 -0400 (0:00:02.082) 0:02:54.319 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 10:49:38 -0400 (0:00:00.446) 0:02:54.766 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 10:49:38 -0400 (0:00:00.310) 0:02:55.077 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:49:38 -0400 (0:00:00.312) 0:02:55.389 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:49:39 -0400 (0:00:00.429) 0:02:55.819 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:49:39 -0400 (0:00:00.264) 0:02:56.083 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:49:40 -0400 (0:00:00.509) 0:02:56.592 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:49:40 -0400 (0:00:00.286) 0:02:56.879 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:49:40 -0400 (0:00:00.209) 0:02:57.089 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define 
an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:49:40 -0400 (0:00:00.145) 0:02:57.235 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:49:41 -0400 (0:00:00.227) 0:02:57.462 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:49:41 -0400 (0:00:00.535) 0:02:57.997 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:49:46 -0400 (0:00:04.625) 0:03:02.623 *********** ok: [managed-node8] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:49:46 -0400 (0:00:00.253) 0:03:02.877 *********** ok: [managed-node8] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:49:46 -0400 (0:00:00.211) 0:03:03.088 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:49:51 -0400 (0:00:05.157) 0:03:08.246 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:49:52 -0400 (0:00:00.541) 0:03:08.787 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:49:52 -0400 (0:00:00.204) 0:03:08.992 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:49:52 -0400 (0:00:00.248) 0:03:09.240 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:49:52 -0400 (0:00:00.167) 0:03:09.408 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:49:57 -0400 (0:00:04.596) 0:03:14.004 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:50:00 -0400 (0:00:03.025) 0:03:17.030 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:50:01 -0400 (0:00:00.427) 0:03:17.457 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:50:01 -0400 (0:00:00.217) 0:03:17.675 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-b96b1aea-97c1-4d41-bb07-48c1b0231627' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 10:50:06 -0400 (0:00:05.244) 0:03:22.919 *********** fatal: [managed-node8]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-b96b1aea-97c1-4d41-bb07-48c1b0231627' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:50:06 -0400 (0:00:00.261) 0:03:23.180 *********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 10:50:06 -0400 (0:00:00.175) 0:03:23.356 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 10:50:07 -0400 
(0:00:00.265) 0:03:23.621 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 10:50:07 -0400 (0:00:00.277) 0:03:23.898 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 13 May 2025 10:50:07 -0400 (0:00:00.219) 0:03:24.118 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147777.5898864, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747147777.5898864, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1747147777.5898864, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3309616486", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 13 May 2025 10:50:09 -0400 (0:00:01.642) 0:03:25.760 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:138 Tuesday 13 May 2025 10:50:09 -0400 (0:00:00.192) 0:03:25.953 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:50:09 -0400 (0:00:00.454) 0:03:26.407 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:50:10 -0400 (0:00:00.323) 0:03:26.731 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:50:10 -0400 (0:00:00.229) 0:03:26.961 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", 
"skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:50:11 -0400 (0:00:00.520) 0:03:27.481 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:50:11 -0400 (0:00:00.280) 0:03:27.762 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:50:11 -0400 (0:00:00.229) 0:03:27.992 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:50:11 -0400 (0:00:00.219) 0:03:28.211 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:50:11 -0400 (0:00:00.128) 0:03:28.340 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:50:12 -0400 (0:00:00.399) 0:03:28.739 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:50:17 -0400 (0:00:04.749) 0:03:33.488 *********** ok: [managed-node8] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:50:17 -0400 (0:00:00.264) 0:03:33.753 *********** ok: [managed-node8] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:50:17 -0400 (0:00:00.267) 0:03:34.020 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:50:22 -0400 (0:00:05.387) 0:03:39.408 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:50:23 -0400 (0:00:00.270) 0:03:39.678 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:50:23 -0400 (0:00:00.422) 0:03:40.100 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:50:23 -0400 (0:00:00.244) 0:03:40.345 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:50:24 -0400 (0:00:00.176) 0:03:40.522 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:50:28 -0400 (0:00:04.681) 0:03:45.204 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": 
"irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": 
"nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": 
{ "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:50:31 -0400 (0:00:03.170) 0:03:48.374 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:50:32 -0400 (0:00:00.466) 0:03:48.841 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:50:32 -0400 (0:00:00.292) 0:03:49.134 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 
May 2025 10:50:38 -0400 (0:00:05.797) 0:03:54.931 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 10:50:38 -0400 (0:00:00.263) 0:03:55.194 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147730.0398142, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e91450c47cf29d287e4cc5315c11341240659429", "ctime": 1747147730.0368145, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747147730.0368145, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 10:50:40 -0400 (0:00:01.564) 0:03:56.759 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:50:41 -0400 (0:00:01.540) 0:03:58.299 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 10:50:42 -0400 (0:00:00.172) 0:03:58.471 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 10:50:42 -0400 (0:00:00.223) 0:03:58.695 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 10:50:42 -0400 (0:00:00.197) 0:03:58.893 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 10:50:42 -0400 (0:00:00.132) 0:03:59.026 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b96b1aea-97c1-4d41-bb07-48c1b0231627" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 10:50:44 -0400 (0:00:01.583) 0:04:00.609 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 10:50:46 -0400 (0:00:01.969) 0:04:02.579 *********** changed: [managed-node8] => (item={'src': 'UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 10:50:48 -0400 (0:00:01.884) 0:04:04.463 *********** skipping: [managed-node8] => (item={'src': 'UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 10:50:48 -0400 (0:00:00.386) 0:04:04.849 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 10:50:50 -0400 (0:00:01.946) 0:04:06.796 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147745.2628374, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b59174efc1121f902cd68a480bc5756cbcf6fa5d", "ctime": 1747147734.9158218, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 492830856, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1747147734.9138217, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "2210735651", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 10:50:51 -0400 (0:00:01.585) 0:04:08.382 *********** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-b96b1aea-97c1-4d41-bb07-48c1b0231627', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 10:50:53 -0400 (0:00:01.634) 0:04:10.017 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:151 Tuesday 13 May 2025 10:50:56 -0400 (0:00:02.441) 0:04:12.458 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 10:50:56 -0400 (0:00:00.607) 0:04:13.065 *********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 10:50:56 -0400 (0:00:00.292) 0:04:13.358 *********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 10:50:57 -0400 (0:00:00.310) 0:04:13.668 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "b7f02da6-7c95-4688-a4ed-4c725c35bc0e" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 10:50:58 -0400 (0:00:01.628) 0:04:15.296 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002605", "end": "2025-05-13 10:51:00.054758", "rc": 0, "start": "2025-05-13 10:51:00.052153" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 10:51:00 -0400 (0:00:01.533) 0:04:16.829 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002693", "end": "2025-05-13 10:51:01.514289", "failed_when_result": false, "rc": 0, "start": "2025-05-13 10:51:01.511596" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 10:51:01 -0400 (0:00:01.415) 0:04:18.245 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 10:51:01 -0400 (0:00:00.178) 0:04:18.423 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 10:51:02 -0400 (0:00:00.459) 0:04:18.882 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 10:51:02 -0400 (0:00:00.232) 0:04:19.115 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 10:51:03 -0400 (0:00:01.199) 0:04:20.314 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 10:51:04 -0400 (0:00:00.313) 0:04:20.628 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 10:51:04 -0400 (0:00:00.294) 0:04:20.923 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 10:51:04 -0400 (0:00:00.229) 0:04:21.152 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 10:51:04 -0400 (0:00:00.271) 0:04:21.423 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 10:51:05 -0400 (0:00:00.278) 0:04:21.702 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 10:51:05 -0400 (0:00:00.249) 0:04:21.951 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 10:51:05 -0400 (0:00:00.265) 0:04:22.217 *********** skipping: [managed-node8] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 10:51:05 -0400 (0:00:00.212) 0:04:22.429 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 10:51:06 -0400 (0:00:00.144) 0:04:22.573 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 10:51:06 -0400 (0:00:00.214) 0:04:22.788 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 10:51:06 -0400 (0:00:00.214) 0:04:23.003 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 10:51:06 -0400 (0:00:00.374) 0:04:23.378 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 10:51:07 -0400 (0:00:00.223) 0:04:23.602 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 10:51:07 -0400 (0:00:00.251) 0:04:23.854 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 10:51:07 -0400 (0:00:00.235) 0:04:24.089 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 10:51:07 -0400 (0:00:00.198) 0:04:24.287 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 10:51:08 -0400 (0:00:00.253) 0:04:24.541 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 10:51:08 -0400 (0:00:00.695) 0:04:25.237 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 10:51:09 -0400 (0:00:00.245) 0:04:25.483 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147838.120978, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747147838.120978, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36559, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1747147838.120978, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 10:51:10 -0400 (0:00:01.593) 0:04:27.077 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 10:51:10 -0400 (0:00:00.221) 0:04:27.298 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 10:51:11 -0400 (0:00:00.275) 0:04:27.574 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 10:51:11 -0400 (0:00:00.341) 0:04:27.916 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 10:51:11 -0400 (0:00:00.231) 0:04:28.148 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 10:51:11 -0400 (0:00:00.208) 0:04:28.356 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 10:51:12 -0400 (0:00:00.269) 0:04:28.626 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 10:51:12 -0400 (0:00:00.291) 0:04:28.917 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 10:51:17 -0400 (0:00:04.617) 0:04:33.535 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 10:51:17 -0400 (0:00:00.276) 0:04:33.811 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 10:51:17 -0400 (0:00:00.296) 0:04:34.108 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 10:51:18 -0400 (0:00:00.347) 0:04:34.455 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 10:51:18 -0400 (0:00:00.221) 0:04:34.677 *********** skipping: 
[managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 10:51:18 -0400 (0:00:00.303) 0:04:34.980 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 10:51:18 -0400 (0:00:00.296) 0:04:35.277 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 10:51:19 -0400 (0:00:00.244) 0:04:35.521 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 10:51:19 -0400 (0:00:00.280) 0:04:35.802 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 10:51:19 -0400 (0:00:00.332) 0:04:36.135 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 10:51:19 -0400 (0:00:00.289) 0:04:36.424 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 10:51:20 -0400 (0:00:00.231) 0:04:36.656 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 10:51:20 -0400 (0:00:00.238) 0:04:36.894 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 10:51:20 -0400 (0:00:00.245) 0:04:37.140 *********** ok: [managed-node8] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 10:51:20 -0400 (0:00:00.220) 0:04:37.361 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 10:51:21 -0400 (0:00:00.312) 0:04:37.673 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 10:51:21 -0400 (0:00:00.381) 0:04:38.054 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 10:51:21 -0400 (0:00:00.295) 0:04:38.350 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 10:51:22 -0400 (0:00:00.220) 0:04:38.571 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 10:51:22 -0400 (0:00:00.260) 0:04:38.832 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 10:51:22 -0400 (0:00:00.180) 0:04:39.012 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 10:51:22 -0400 (0:00:00.171) 0:04:39.183 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 10:51:22 -0400 (0:00:00.174) 0:04:39.358 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 10:51:23 -0400 (0:00:00.160) 0:04:39.519 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 10:51:23 -0400 (0:00:00.207) 0:04:39.726 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 10:51:23 -0400 (0:00:00.170) 0:04:39.897 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 10:51:23 -0400 (0:00:00.201) 0:04:40.099 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 10:51:23 -0400 (0:00:00.199) 0:04:40.298 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 10:51:24 -0400 (0:00:00.249) 0:04:40.548 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 10:51:24 -0400 (0:00:00.251) 0:04:40.799 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 10:51:24 -0400 (0:00:00.247) 0:04:41.047 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 10:51:24 -0400 (0:00:00.181) 0:04:41.228 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 10:51:24 -0400 (0:00:00.175) 0:04:41.404 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 10:51:25 -0400 (0:00:00.221) 0:04:41.626 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 10:51:25 -0400 (0:00:00.201) 0:04:41.827 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 10:51:25 -0400 (0:00:00.258) 0:04:42.086 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 10:51:25 -0400 (0:00:00.294) 0:04:42.380 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 10:51:26 -0400 (0:00:00.309) 0:04:42.690 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 10:51:26 -0400 (0:00:00.256) 0:04:42.946 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 10:51:26 -0400 (0:00:00.236) 0:04:43.182 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 10:51:26 -0400 (0:00:00.188) 0:04:43.371 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 10:51:27 -0400 (0:00:00.219) 0:04:43.591 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 10:51:27 -0400 (0:00:00.312) 0:04:43.903 *********** skipping: [managed-node8] => {} TASK 
[Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 10:51:27 -0400 (0:00:00.328) 0:04:44.231 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 10:51:28 -0400 (0:00:00.289) 0:04:44.521 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 10:51:28 -0400 (0:00:00.318) 0:04:44.840 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 10:51:28 -0400 (0:00:00.339) 0:04:45.179 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 10:51:28 -0400 (0:00:00.254) 0:04:45.433 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 10:51:29 -0400 (0:00:00.208) 0:04:45.642 *********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 10:51:29 -0400 (0:00:00.236) 0:04:45.878 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 10:51:29 -0400 (0:00:00.316) 0:04:46.195 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 10:51:30 -0400 (0:00:00.344) 0:04:46.539 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 10:51:30 -0400 (0:00:00.285) 0:04:46.824 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 10:51:30 -0400 (0:00:00.185) 0:04:47.010 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 10:51:30 -0400 (0:00:00.278) 0:04:47.288 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 10:51:31 -0400 (0:00:00.668) 0:04:47.957 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 10:51:31 -0400 (0:00:00.196) 0:04:48.154 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 10:51:31 -0400 (0:00:00.218) 0:04:48.373 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 10:51:32 -0400 (0:00:00.283) 0:04:48.656 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 10:51:32 -0400 (0:00:00.178) 0:04:48.835 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 13 May 2025 10:51:32 -0400 (0:00:00.170) 0:04:49.006 *********** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:157 Tuesday 13 May 2025 10:51:34 -0400 (0:00:01.600) 0:04:50.607 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 10:51:34 -0400 (0:00:00.439) 0:04:51.046 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 10:51:34 -0400 (0:00:00.375) 0:04:51.422 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:51:35 -0400 (0:00:00.424) 0:04:51.846 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:51:35 -0400 (0:00:00.366) 0:04:52.213 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:51:36 -0400 (0:00:00.293) 0:04:52.507 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if 
system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:51:36 -0400 (0:00:00.481) 0:04:52.989 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:51:36 -0400 (0:00:00.197) 0:04:53.186 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:51:36 -0400 (0:00:00.240) 0:04:53.426 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:51:37 -0400 (0:00:00.217) 0:04:53.644 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:51:37 -0400 (0:00:00.191) 0:04:53.836 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:51:37 -0400 (0:00:00.455) 0:04:54.292 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:51:41 -0400 (0:00:03.974) 0:04:58.266 *********** ok: [managed-node8] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:51:42 -0400 (0:00:00.259) 0:04:58.525 *********** ok: [managed-node8] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:51:42 -0400 (0:00:00.289) 0:04:58.815 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:51:47 -0400 (0:00:05.254) 0:05:04.069 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:51:48 -0400 (0:00:00.491) 0:05:04.560 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:51:48 -0400 (0:00:00.258) 0:05:04.819 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:51:48 -0400 (0:00:00.233) 0:05:05.052 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:51:48 -0400 (0:00:00.183) 0:05:05.236 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:51:53 -0400 (0:00:04.397) 0:05:09.634 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service": { "name": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service": { "name": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:51:56 -0400 (0:00:02.830) 0:05:12.464 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service" ] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:51:56 -0400 (0:00:00.239) 0:05:12.704 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2db96b1aea\x2d97c1\x2d4d41\x2dbb07\x2d48c1b0231627.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "name": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-b96b1aea-97c1-4d41-bb07-48c1b0231627", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b96b1aea-97c1-4d41-bb07-48c1b0231627 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b96b1aea-97c1-4d41-bb07-48c1b0231627 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 10:50:49 EDT", "StateChangeTimestampMonotonic": "2008006775", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d97c1\x2d4d41\x2dbb07\x2d48c1b0231627.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "name": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", 
"SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:51:59 -0400 (0:00:02.972) 0:05:15.676 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 10:52:03 -0400 (0:00:04.734) 0:05:20.411 *********** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:52:04 -0400 (0:00:00.281) 0:05:20.693 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2db96b1aea\x2d97c1\x2d4d41\x2dbb07\x2d48c1b0231627.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "name": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2db96b1aea\\x2d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": 
"0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d97c1\x2d4d41\x2dbb07\x2d48c1b0231627.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "name": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", 
"LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d97c1\\x2d4d41\\x2dbb07\\x2d48c1b0231627.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 10:52:07 -0400 (0:00:02.767) 0:05:23.460 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 10:52:07 -0400 (0:00:00.184) 0:05:23.645 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 10:52:07 -0400 (0:00:00.272) 0:05:23.917 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 13 May 2025 10:52:07 -0400 (0:00:00.237) 0:05:24.155 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147893.8160627, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747147893.8160627, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1747147893.8160627, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2763018594", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 13 May 2025 10:52:09 -0400 (0:00:01.367) 0:05:25.523 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:177 Tuesday 13 May 2025 10:52:09 -0400 (0:00:00.239) 0:05:25.762 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:52:10 -0400 (0:00:00.914) 0:05:26.677 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:52:10 -0400 (0:00:00.402) 0:05:27.079 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific 
variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:52:10 -0400 (0:00:00.287) 0:05:27.366 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:52:11 -0400 (0:00:00.591) 0:05:27.958 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:52:11 -0400 (0:00:00.188) 0:05:28.146 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:52:11 -0400 (0:00:00.178) 0:05:28.325 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:52:12 -0400 (0:00:00.793) 0:05:29.118 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:52:12 -0400 (0:00:00.197) 0:05:29.315 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK 
[fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:52:13 -0400 (0:00:00.533) 0:05:29.849 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:52:18 -0400 (0:00:04.644) 0:05:34.493 *********** ok: [managed-node8] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:52:18 -0400 (0:00:00.309) 0:05:34.802 *********** ok: [managed-node8] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:52:18 -0400 (0:00:00.327) 0:05:35.130 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:52:23 -0400 (0:00:05.192) 0:05:40.322 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:52:24 -0400 (0:00:00.231) 0:05:40.554 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:52:24 -0400 (0:00:00.204) 0:05:40.759 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:52:24 -0400 (0:00:00.207) 0:05:40.967 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:52:24 -0400 (0:00:00.189) 0:05:41.157 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 
10:52:29 -0400 (0:00:04.712) 0:05:45.869 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": 
"initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": 
"rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": 
"systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": 
"user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:52:32 -0400 (0:00:02.595) 0:05:48.465 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:52:32 -0400 (0:00:00.282) 0:05:48.747 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:52:32 -0400 (0:00:00.338) 0:05:49.086 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e455c4cc-634c-45f5-b789-65424c827f52", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": 
null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 10:52:45 -0400 (0:00:13.175) 0:06:02.261 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 10:52:45 -0400 (0:00:00.167) 0:06:02.429 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147847.6839926, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e62aea0a73163f4a699ed0e6ba08dddbbd9ef3b4", "ctime": 1747147847.6809926, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747147847.6809926, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 10:52:47 -0400 (0:00:01.450) 0:06:03.879 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:52:49 -0400 (0:00:01.583) 0:06:05.463 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 10:52:49 -0400 (0:00:00.178) 0:06:05.642 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e455c4cc-634c-45f5-b789-65424c827f52", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", 
"/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 10:52:49 -0400 (0:00:00.386) 0:06:06.028 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 10:52:49 -0400 (0:00:00.176) 0:06:06.205 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 10:52:50 -0400 (0:00:00.320) 0:06:06.525 *********** changed: [managed-node8] => (item={'src': 'UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=b7f02da6-7c95-4688-a4ed-4c725c35bc0e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 10:52:51 -0400 (0:00:01.311) 0:06:07.837 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 10:52:53 -0400 (0:00:01.765) 0:06:09.602 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 10:52:54 -0400 (0:00:01.580) 0:06:11.183 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 10:52:54 -0400 (0:00:00.169) 0:06:11.352 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 10:52:56 -0400 
(0:00:01.701) 0:06:13.054 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147861.5130136, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747147853.214001, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 85983435, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1747147853.213001, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3312789539", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 10:52:58 -0400 (0:00:01.642) 0:06:14.697 *********** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-e455c4cc-634c-45f5-b789-65424c827f52', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-e455c4cc-634c-45f5-b789-65424c827f52", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 10:52:59 -0400 (0:00:01.578) 0:06:16.275 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:190 Tuesday 13 May 2025 10:53:01 -0400 (0:00:01.607) 0:06:17.883 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 10:53:01 -0400 (0:00:00.475) 0:06:18.359 *********** skipping: [managed-node8] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 10:53:02 -0400 (0:00:00.159) 0:06:18.519 *********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 10:53:02 -0400 (0:00:00.237) 0:06:18.756 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "size": "10G", "type": "crypt", "uuid": "e66fe618-a253-4ecd-b3df-94572d0056dd" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e455c4cc-634c-45f5-b789-65424c827f52" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 10:53:03 -0400 (0:00:01.315) 0:06:20.072 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003169", "end": "2025-05-13 10:53:04.759647", "rc": 0, "start": "2025-05-13 10:53:04.756478" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 10:53:04 -0400 (0:00:01.355) 0:06:21.427 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003005", "end": "2025-05-13 10:53:06.215524", "failed_when_result": false, "rc": 0, "start": "2025-05-13 10:53:06.212519" } STDOUT: luks-e455c4cc-634c-45f5-b789-65424c827f52 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 10:53:06 -0400 (0:00:01.514) 0:06:22.941 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 10:53:06 -0400 (0:00:00.177) 0:06:23.119 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 10:53:07 -0400 (0:00:00.532) 0:06:23.651 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 10:53:07 -0400 (0:00:00.328) 0:06:23.980 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for 
managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 10:53:08 -0400 (0:00:01.098) 0:06:25.078 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 10:53:08 -0400 (0:00:00.252) 0:06:25.331 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 10:53:09 -0400 (0:00:00.281) 0:06:25.612 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 10:53:09 -0400 (0:00:00.064) 0:06:25.676 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 10:53:09 -0400 (0:00:00.285) 0:06:25.962 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 10:53:09 -0400 (0:00:00.178) 0:06:26.140 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 10:53:09 -0400 (0:00:00.194) 0:06:26.334 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 10:53:10 -0400 (0:00:00.543) 0:06:26.878 
*********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 10:53:10 -0400 (0:00:00.190) 0:06:27.068 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 10:53:10 -0400 (0:00:00.104) 0:06:27.173 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 10:53:10 -0400 (0:00:00.158) 0:06:27.331 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 10:53:11 -0400 (0:00:00.146) 0:06:27.478 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 10:53:11 -0400 (0:00:00.299) 0:06:27.777 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 10:53:11 -0400 (0:00:00.231) 0:06:28.009 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 10:53:11 -0400 (0:00:00.215) 0:06:28.224 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 10:53:11 -0400 (0:00:00.130) 0:06:28.355 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed 
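[editor's note] For reference, the /etc/fstab and /etc/crypttab entries being asserted above come from a storage role invocation along these lines. This is a minimal sketch reconstructed from the volume facts printed earlier in this run (name foo, type disk, disks [sda], xfs on /opt/test1 with encryption enabled); it is not the actual source of tests_luks.yml, and the password value is a hypothetical placeholder since the real one is masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER:

    - name: Create a LUKS-encrypted XFS volume on sda mounted at /opt/test1 (sketch)
      hosts: managed-node8
      tasks:
        - name: Run the storage role with a single encrypted disk volume
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo                 # volume name seen in the blivet output above
                type: disk
                disks:
                  - sda                   # raw device that ends up as /dev/mapper/luks-<uuid>
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_password: REPLACE_ME   # hypothetical placeholder; real value is no_log'd

With variables like these, the role performs the actions logged above (destroy the old xfs format on /dev/sda, create the LUKS format and mapper device, create xfs on it), mounts it at /opt/test1, and records the device in /etc/fstab and /etc/crypttab, which is exactly what the verify-role-results tasks in this section are checking.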
TASK [Clean up variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 10:53:12 -0400 (0:00:00.135) 0:06:28.490 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 10:53:12 -0400 (0:00:00.138) 0:06:28.629 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 10:53:12 -0400 (0:00:00.239) 0:06:28.868 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 10:53:12 -0400 (0:00:00.266) 0:06:29.135 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147965.3871713, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747147965.3871713, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36559, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1747147965.3871713, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 10:53:13 -0400 (0:00:01.240) 0:06:30.375 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 10:53:14 -0400 (0:00:00.321) 0:06:30.697 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 10:53:14 -0400 (0:00:00.236) 0:06:30.934 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 10:53:14 -0400 (0:00:00.217) 0:06:31.151 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 10:53:14 -0400 (0:00:00.280) 0:06:31.432 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 10:53:15 -0400 (0:00:00.342) 0:06:31.774 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 10:53:15 -0400 (0:00:00.286) 0:06:32.061 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147965.5091715, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747147965.5091715, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 156967, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747147965.5091715, "nlink": 1, "path": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 10:53:16 -0400 (0:00:01.271) 0:06:33.332 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 10:53:21 -0400 (0:00:04.556) 0:06:37.889 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010684", "end": "2025-05-13 10:53:22.678863", "rc": 0, "start": "2025-05-13 10:53:22.668179" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: e455c4cc-634c-45f5-b789-65424c827f52 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 919618 Threads: 2 Salt: 4d a5 76 c7 01 c7 0a d5 9f 74 8b cc ed a8 
8f b6 4c ae c3 44 d9 8c 99 a0 bb fb d2 4c c1 69 d2 35 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120029 Salt: 40 1b 88 7f 5d 38 ba 9d 42 61 fc bd 92 0a 33 c0 19 dc 82 0d f7 80 b0 f6 e2 7b 02 7c 0f 50 7b 2c Digest: 48 12 62 14 86 f9 3e 16 c3 30 fe 49 cf 7b e4 87 45 07 de db fa ef ef 8c 87 03 41 df 4e 02 68 40 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 10:53:22 -0400 (0:00:01.513) 0:06:39.402 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 10:53:23 -0400 (0:00:00.301) 0:06:39.704 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 10:53:23 -0400 (0:00:00.297) 0:06:40.002 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 10:53:23 -0400 (0:00:00.395) 0:06:40.398 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 10:53:24 -0400 (0:00:00.279) 0:06:40.678 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 10:53:24 -0400 (0:00:00.266) 0:06:40.944 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 10:53:24 -0400 (0:00:00.291) 0:06:41.235 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 10:53:25 -0400 (0:00:00.292) 0:06:41.528 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-e455c4cc-634c-45f5-b789-65424c827f52 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 10:53:25 -0400 (0:00:00.300) 0:06:41.829 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 10:53:25 -0400 (0:00:00.248) 0:06:42.077 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 10:53:26 -0400 (0:00:00.362) 0:06:42.440 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 10:53:26 -0400 (0:00:00.381) 0:06:42.821 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 10:53:26 -0400 (0:00:00.367) 0:06:43.189 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 10:53:26 -0400 (0:00:00.181) 0:06:43.371 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 10:53:27 -0400 (0:00:00.219) 0:06:43.590 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 10:53:27 -0400 (0:00:00.176) 0:06:43.767 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 10:53:27 -0400 (0:00:00.214) 0:06:43.981 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 10:53:27 -0400 (0:00:00.280) 0:06:44.262 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 10:53:28 -0400 (0:00:00.299) 0:06:44.561 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 10:53:28 -0400 (0:00:00.281) 0:06:44.843 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 10:53:28 -0400 (0:00:00.253) 0:06:45.096 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 10:53:28 -0400 (0:00:00.217) 0:06:45.314 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 10:53:29 -0400 (0:00:00.237) 0:06:45.552 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 10:53:29 -0400 (0:00:00.403) 0:06:45.955 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 10:53:29 -0400 (0:00:00.309) 0:06:46.265 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 10:53:30 -0400 (0:00:00.220) 0:06:46.486 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 10:53:30 -0400 (0:00:00.348) 0:06:46.834 *********** ok: 
[managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 10:53:30 -0400 (0:00:00.325) 0:06:47.160 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 10:53:31 -0400 (0:00:00.347) 0:06:47.508 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 10:53:31 -0400 (0:00:00.288) 0:06:47.797 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 10:53:31 -0400 (0:00:00.314) 0:06:48.111 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 10:53:32 -0400 (0:00:00.545) 0:06:48.656 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 10:53:32 -0400 (0:00:00.214) 0:06:48.871 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 10:53:32 -0400 (0:00:00.205) 0:06:49.077 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 10:53:32 -0400 (0:00:00.196) 0:06:49.274 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 10:53:33 -0400 (0:00:00.262) 0:06:49.536 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 10:53:33 -0400 (0:00:00.219) 
0:06:49.756 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 10:53:33 -0400 (0:00:00.265) 0:06:50.022 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 10:53:33 -0400 (0:00:00.232) 0:06:50.255 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 10:53:34 -0400 (0:00:00.274) 0:06:50.529 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 10:53:34 -0400 (0:00:00.267) 0:06:50.796 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 10:53:34 -0400 (0:00:00.207) 0:06:51.004 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 10:53:34 -0400 (0:00:00.212) 0:06:51.217 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 10:53:34 -0400 (0:00:00.111) 0:06:51.328 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 10:53:34 -0400 (0:00:00.097) 0:06:51.425 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 10:53:35 -0400 (0:00:00.078) 0:06:51.504 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 10:53:35 -0400 (0:00:00.048) 
0:06:51.553 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 10:53:35 -0400 (0:00:00.174) 0:06:51.727 *********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 10:53:35 -0400 (0:00:00.249) 0:06:51.977 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 10:53:35 -0400 (0:00:00.265) 0:06:52.243 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 10:53:35 -0400 (0:00:00.128) 0:06:52.371 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 10:53:36 -0400 (0:00:00.138) 0:06:52.510 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 10:53:36 -0400 (0:00:00.116) 0:06:52.626 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 10:53:36 -0400 (0:00:00.184) 0:06:52.811 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 10:53:36 -0400 (0:00:00.155) 0:06:52.967 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 10:53:36 -0400 (0:00:00.182) 0:06:53.149 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 10:53:36 -0400 (0:00:00.198) 0:06:53.347 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 10:53:37 -0400 (0:00:00.166) 0:06:53.513 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 10:53:37 -0400 (0:00:00.132) 0:06:53.646 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:197 Tuesday 13 May 2025 10:53:37 -0400 (0:00:00.141) 0:06:53.788 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 10:53:37 -0400 (0:00:00.529) 0:06:54.317 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 10:53:38 -0400 (0:00:00.272) 0:06:54.590 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:53:38 -0400 (0:00:00.237) 0:06:54.827 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:53:38 -0400 (0:00:00.262) 0:06:55.090 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:53:38 -0400 (0:00:00.103) 0:06:55.193 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:53:39 -0400 (0:00:00.333) 0:06:55.527 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:53:39 -0400 (0:00:00.335) 0:06:55.863 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:53:39 -0400 (0:00:00.267) 0:06:56.131 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:53:39 -0400 (0:00:00.160) 0:06:56.291 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:53:40 -0400 (0:00:00.255) 0:06:56.546 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:53:40 -0400 (0:00:00.540) 0:06:57.087 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] 
****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:53:45 -0400 (0:00:04.814) 0:07:01.902 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:53:45 -0400 (0:00:00.267) 0:07:02.169 *********** ok: [managed-node8] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:53:46 -0400 (0:00:00.337) 0:07:02.507 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:53:51 -0400 (0:00:05.089) 0:07:07.596 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:53:51 -0400 (0:00:00.498) 0:07:08.094 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:53:51 -0400 (0:00:00.257) 0:07:08.352 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:53:52 -0400 (0:00:00.252) 0:07:08.605 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:53:52 -0400 (0:00:00.223) 0:07:08.828 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:53:57 -0400 (0:00:04.704) 0:07:13.533 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { 
"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { 
"name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { 
"name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { 
"name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:53:59 -0400 (0:00:02.860) 0:07:16.393 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:54:00 -0400 (0:00:00.336) 0:07:16.730 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:54:00 -0400 (0:00:00.229) 0:07:16.959 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 10:54:05 -0400 (0:00:05.246) 0:07:22.206 *********** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:54:05 -0400 (0:00:00.115) 0:07:22.321 *********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 10:54:06 -0400 (0:00:00.169) 0:07:22.491 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 10:54:06 -0400 (0:00:00.198) 0:07:22.689 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 10:54:06 -0400 (0:00:00.306) 0:07:22.996 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:216 Tuesday 13 May 2025 10:54:06 -0400 (0:00:00.159) 0:07:23.155 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:54:07 -0400 (0:00:00.815) 0:07:23.971 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:54:08 -0400 (0:00:00.736) 0:07:24.707 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:54:08 -0400 (0:00:00.155) 0:07:24.863 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:54:08 -0400 (0:00:00.484) 0:07:25.348 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:54:09 -0400 (0:00:00.224) 0:07:25.572 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:54:09 -0400 (0:00:00.288) 0:07:25.861 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:54:09 -0400 (0:00:00.138) 0:07:25.999 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:54:09 -0400 (0:00:00.135) 0:07:26.134 *********** included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:54:10 -0400 (0:00:00.536) 0:07:26.670 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:54:15 -0400 (0:00:04.966) 0:07:31.637 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:54:15 -0400 (0:00:00.305) 0:07:31.943 *********** ok: [managed-node8] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:54:15 -0400 (0:00:00.256) 0:07:32.199 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:54:21 -0400 (0:00:05.442) 0:07:37.642 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:54:21 -0400 (0:00:00.363) 0:07:38.006 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:54:21 -0400 (0:00:00.191) 0:07:38.197 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:54:21 -0400 (0:00:00.172) 0:07:38.370 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:54:22 -0400 (0:00:00.216) 0:07:38.586 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
[fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:54:26 -0400 (0:00:04.620) 0:07:43.206 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": 
{ "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:54:29 -0400 (0:00:03.137) 0:07:46.343 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:54:30 -0400 (0:00:00.314) 0:07:46.658 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:54:30 -0400 (0:00:00.191) 0:07:46.849 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e455c4cc-634c-45f5-b789-65424c827f52", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 10:54:44 -0400 (0:00:14.230) 0:08:01.079 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 10:54:44 -0400 (0:00:00.315) 0:08:01.394 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147974.4431849, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ef40d7ba867f0a893fd497b6ccd40309819a5400", "ctime": 1747147974.440185, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747147974.440185, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 10:54:46 -0400 (0:00:01.569) 0:08:02.964 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the 
systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:54:48 -0400 (0:00:01.550) 0:08:04.514 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 10:54:48 -0400 (0:00:00.142) 0:08:04.657 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-e455c4cc-634c-45f5-b789-65424c827f52", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": 
null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 10:54:48 -0400 (0:00:00.214) 0:08:04.871 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 10:54:48 -0400 (0:00:00.178) 0:08:05.050 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 10:54:48 -0400 (0:00:00.193) 0:08:05.243 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", 
"src": "/dev/mapper/luks-e455c4cc-634c-45f5-b789-65424c827f52" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 10:54:50 -0400 (0:00:01.521) 0:08:06.764 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 10:54:52 -0400 (0:00:01.983) 0:08:08.747 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 10:54:54 -0400 (0:00:02.029) 0:08:10.777 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 10:54:54 -0400 (0:00:00.357) 0:08:11.134 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 10:54:56 -0400 (0:00:01.913) 0:08:13.048 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747147986.214203, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1c7b37a55ce14b4e5e6fddb024aab0d81900a425", "ctime": 1747147979.5231926, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 213909707, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1747147979.5211926, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": 
true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1056601526", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 10:54:58 -0400 (0:00:01.477) 0:08:14.525 *********** changed: [managed-node8] => (item={'backing_device': '/dev/sda', 'name': 'luks-e455c4cc-634c-45f5-b789-65424c827f52', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-e455c4cc-634c-45f5-b789-65424c827f52", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 'name': 'luks-5c6835a5-2263-482b-9ceb-7d65172113bc', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 10:55:00 -0400 (0:00:02.632) 0:08:17.157 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:233 Tuesday 13 May 2025 10:55:02 -0400 (0:00:01.792) 0:08:18.950 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 10:55:03 -0400 (0:00:00.493) 0:08:19.443 *********** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 10:55:03 -0400 (0:00:00.395) 0:08:19.839 *********** skipping: [managed-node8] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 10:55:03 -0400 (0:00:00.124) 0:08:19.963 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "size": "10G", "type": "crypt", "uuid": "2d5a198e-30c8-4e0b-8f51-43f8f8ebbcdf" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "5c6835a5-2263-482b-9ceb-7d65172113bc" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 10:55:04 -0400 (0:00:01.221) 0:08:21.185 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002614", "end": "2025-05-13 10:55:05.878617", "rc": 0, "start": "2025-05-13 10:55:05.876003" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under 
'/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 10:55:06 -0400 (0:00:01.386) 0:08:22.572 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002587", "end": "2025-05-13 10:55:07.169824", "failed_when_result": false, "rc": 0, "start": "2025-05-13 10:55:07.167237" } STDOUT: luks-5c6835a5-2263-482b-9ceb-7d65172113bc /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 10:55:07 -0400 (0:00:01.324) 0:08:23.897 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 13 May 2025 10:55:07 -0400 (0:00:00.470) 0:08:24.368 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 13 May 2025 10:55:08 -0400 (0:00:00.247) 0:08:24.615 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 13 May 2025 10:55:08 -0400 (0:00:00.291) 0:08:24.906 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 13 May 2025 10:55:09 -0400 
(0:00:00.738) 0:08:25.645 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 13 May 2025 10:55:09 -0400 (0:00:00.556) 0:08:26.201 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 13 May 2025 10:55:10 -0400 (0:00:00.267) 0:08:26.469 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 13 May 2025 10:55:10 -0400 (0:00:00.479) 0:08:26.949 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 13 May 2025 10:55:10 -0400 (0:00:00.312) 0:08:27.261 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 13 May 2025 10:55:11 -0400 (0:00:00.295) 0:08:27.557 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 13 May 2025 10:55:11 -0400 (0:00:00.299) 0:08:27.857 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 13 May 2025 10:55:11 -0400 (0:00:00.314) 0:08:28.171 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 13 May 2025 10:55:12 -0400 (0:00:00.317) 0:08:28.488 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Tuesday 13 May 2025 10:55:12 -0400 
(0:00:00.216) 0:08:28.705 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Tuesday 13 May 2025 10:55:12 -0400 (0:00:00.342) 0:08:29.047 *********** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.80 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Tuesday 13 May 2025 10:55:14 -0400 (0:00:01.641) 0:08:30.688 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Tuesday 13 May 2025 10:55:14 -0400 (0:00:00.199) 0:08:30.888 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 13 May 2025 10:55:14 -0400 (0:00:00.544) 0:08:31.433 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 13 May 2025 10:55:15 -0400 (0:00:00.193) 0:08:31.626 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 13 May 2025 10:55:15 -0400 (0:00:00.187) 0:08:31.813 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 13 May 2025 10:55:15 -0400 (0:00:00.158) 0:08:31.972 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 13 May 2025 10:55:15 -0400 (0:00:00.251) 0:08:32.223 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 13 May 2025 10:55:15 -0400 (0:00:00.201) 0:08:32.425 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 13 May 2025 10:55:16 -0400 (0:00:00.168) 0:08:32.594 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 13 May 2025 10:55:16 -0400 (0:00:00.216) 0:08:32.810 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 13 May 2025 10:55:16 -0400 (0:00:00.285) 0:08:33.095 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 13 May 2025 10:55:16 -0400 (0:00:00.307) 0:08:33.403 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 13 May 2025 10:55:17 -0400 (0:00:00.412) 0:08:33.815 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Tuesday 13 May 2025 10:55:17 -0400 (0:00:00.360) 0:08:34.176 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 13 May 2025 10:55:18 -0400 (0:00:00.505) 0:08:34.681 *********** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': 
'/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Tuesday 13 May 2025 10:55:18 -0400 (0:00:00.356) 0:08:35.038 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 13 May 2025 10:55:19 -0400 (0:00:00.441) 0:08:35.479 *********** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional 
result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Tuesday 13 May 2025 10:55:19 -0400 (0:00:00.138) 0:08:35.617 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 13 May 2025 10:55:19 -0400 (0:00:00.535) 0:08:36.153 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 13 May 2025 10:55:20 -0400 (0:00:00.417) 0:08:36.570 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 13 May 2025 10:55:20 -0400 (0:00:00.320) 0:08:36.890 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 13 May 2025 10:55:20 -0400 (0:00:00.235) 0:08:37.126 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Tuesday 13 May 2025 10:55:20 -0400 (0:00:00.166) 0:08:37.293 
*********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 13 May 2025 10:55:21 -0400 (0:00:00.345) 0:08:37.639 *********** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Tuesday 13 May 2025 10:55:21 -0400 (0:00:00.280) 0:08:37.920 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 13 May 2025 10:55:22 -0400 (0:00:00.729) 0:08:38.649 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 13 May 2025 10:55:22 -0400 (0:00:00.267) 0:08:38.917 *********** skipping: [managed-node8] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 13 May 2025 10:55:22 -0400 (0:00:00.280) 0:08:39.198 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 13 May 2025 10:55:22 -0400 (0:00:00.187) 0:08:39.385 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 13 May 2025 10:55:23 -0400 (0:00:00.186) 0:08:39.571 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 13 May 2025 10:55:23 -0400 (0:00:00.222) 0:08:39.794 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 13 May 2025 10:55:23 -0400 (0:00:00.194) 0:08:39.988 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Tuesday 13 May 2025 10:55:23 -0400 (0:00:00.159) 0:08:40.148 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 13 May 2025 10:55:23 -0400 (0:00:00.147) 0:08:40.296 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path:
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 10:55:24 -0400 (0:00:00.557) 0:08:40.854 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 10:55:24 -0400 (0:00:00.354) 0:08:41.208 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 10:55:26 -0400 (0:00:01.359) 0:08:42.567 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 10:55:26 -0400 (0:00:00.333) 0:08:42.901 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 10:55:26 -0400 (0:00:00.340) 0:08:43.242 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 10:55:27 -0400 (0:00:00.216) 0:08:43.458 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 10:55:27 -0400 (0:00:00.248) 0:08:43.707 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 10:55:27 -0400 (0:00:00.262) 0:08:43.969 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 10:55:27 -0400 (0:00:00.232) 0:08:44.202 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 10:55:28 -0400 (0:00:00.274) 0:08:44.477 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 10:55:28 -0400 (0:00:00.273) 0:08:44.750 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 10:55:28 -0400 (0:00:00.280) 0:08:45.031 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 10:55:28 -0400 (0:00:00.218) 0:08:45.250 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 10:55:29 -0400 (0:00:00.282) 0:08:45.532 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 10:55:29 -0400 (0:00:00.447) 0:08:45.980 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 10:55:29 -0400 (0:00:00.300) 0:08:46.280 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 10:55:30 -0400 (0:00:00.282) 0:08:46.562 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 10:55:30 -0400 (0:00:00.268) 0:08:46.830 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 10:55:30 -0400 (0:00:00.277) 0:08:47.108 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 10:55:30 -0400 (0:00:00.129) 0:08:47.238 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 10:55:31 -0400 (0:00:00.235) 0:08:47.473 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 10:55:31 -0400 (0:00:00.331) 0:08:47.805 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148084.0333512, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148084.0333512, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 169792, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1747148084.0333512, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, 
"rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 10:55:32 -0400 (0:00:01.361) 0:08:49.166 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 10:55:32 -0400 (0:00:00.248) 0:08:49.415 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 10:55:33 -0400 (0:00:00.256) 0:08:49.672 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 10:55:33 -0400 (0:00:00.268) 0:08:49.940 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 10:55:33 -0400 (0:00:00.232) 0:08:50.172 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 10:55:34 -0400 (0:00:00.751) 0:08:50.924 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 10:55:34 -0400 (0:00:00.266) 0:08:51.191 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148084.1833515, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148084.1833515, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 168921, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148084.1833515, "nlink": 1, "path": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] 
******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 10:55:36 -0400 (0:00:01.498) 0:08:52.690 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 10:55:40 -0400 (0:00:04.598) 0:08:57.288 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.011113", "end": "2025-05-13 10:55:42.131108", "rc": 0, "start": "2025-05-13 10:55:42.119995" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 5c6835a5-2263-482b-9ceb-7d65172113bc Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 930909 Threads: 2 Salt: 36 2f 52 04 fe 71 a9 03 82 dd 08 3c 1f eb 85 12 90 bc 00 b5 da 59 32 91 ce 6b 74 86 92 1b df e1 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: d4 bb 05 a6 ba 73 ff 89 29 41 da ea 4a d4 b4 1c 7d 87 54 70 3c 46 f9 d4 20 78 a7 fd 0d a0 76 af Digest: d4 ff dd 41 a3 d5 d9 64 52 5b 0b 03 00 d6 19 36 77 43 91 bf 64 07 b7 e6 fd ce c1 4e 12 9a 79 9d TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 10:55:42 -0400 (0:00:01.540) 0:08:58.829 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 10:55:42 -0400 (0:00:00.302) 0:08:59.132 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 10:55:42 -0400 (0:00:00.276) 0:08:59.409 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 10:55:43 -0400 (0:00:00.165) 0:08:59.574 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 10:55:43 -0400 (0:00:00.253) 0:08:59.827 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 10:55:43 -0400 (0:00:00.219) 0:09:00.047 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 10:55:43 -0400 (0:00:00.342) 0:09:00.389 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 10:55:44 -0400 (0:00:00.251) 0:09:00.641 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-5c6835a5-2263-482b-9ceb-7d65172113bc /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 10:55:44 -0400 (0:00:00.367) 0:09:01.008 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 10:55:44 -0400 (0:00:00.308) 0:09:01.317 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 10:55:45 -0400 (0:00:00.297) 0:09:01.614 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 10:55:45 -0400 (0:00:00.264) 0:09:01.879 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 10:55:45 -0400 (0:00:00.318) 0:09:02.197 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 10:55:46 -0400 (0:00:00.253) 0:09:02.451 *********** skipping: [managed-node8] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 10:55:46 -0400 (0:00:00.286) 0:09:02.737 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 10:55:46 -0400 (0:00:00.316) 0:09:03.054 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 10:55:46 -0400 (0:00:00.286) 0:09:03.340 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 10:55:47 -0400 (0:00:00.317) 0:09:03.657 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 10:55:47 -0400 (0:00:00.214) 0:09:03.871 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 10:55:47 -0400 (0:00:00.258) 0:09:04.129 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 10:55:47 -0400 (0:00:00.207) 0:09:04.337 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 10:55:48 -0400 (0:00:00.179) 0:09:04.516 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 10:55:48 -0400 (0:00:00.273) 0:09:04.789 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 10:55:48 -0400 (0:00:00.287) 0:09:05.076 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 10:55:48 -0400 (0:00:00.310) 0:09:05.387 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 10:55:49 -0400 (0:00:00.311) 0:09:05.698 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 10:55:49 -0400 (0:00:00.326) 0:09:06.025 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 10:55:49 -0400 (0:00:00.163) 0:09:06.188 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 10:55:50 -0400 (0:00:00.278) 0:09:06.467 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 10:55:50 -0400 (0:00:00.217) 0:09:06.685 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 10:55:50 -0400 (0:00:00.284) 0:09:06.970 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 10:55:50 -0400 (0:00:00.252) 0:09:07.222 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 10:55:51 -0400 (0:00:00.286) 0:09:07.509 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] 
*************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 10:55:51 -0400 (0:00:00.287) 0:09:07.797 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 10:55:51 -0400 (0:00:00.247) 0:09:08.044 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 10:55:51 -0400 (0:00:00.204) 0:09:08.248 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 10:55:52 -0400 (0:00:00.250) 0:09:08.499 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 10:55:52 -0400 (0:00:00.269) 0:09:08.769 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 10:55:52 -0400 (0:00:00.237) 0:09:09.006 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 10:55:52 -0400 (0:00:00.376) 0:09:09.383 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 10:55:53 -0400 (0:00:00.327) 0:09:09.711 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 10:55:53 -0400 (0:00:00.311) 0:09:10.022 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 10:55:53 -0400 (0:00:00.267) 0:09:10.289 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and 
percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 10:55:54 -0400 (0:00:00.310) 0:09:10.600 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 10:55:54 -0400 (0:00:00.237) 0:09:10.837 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 10:55:54 -0400 (0:00:00.228) 0:09:11.066 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 10:55:54 -0400 (0:00:00.217) 0:09:11.283 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 10:55:55 -0400 (0:00:00.313) 0:09:11.597 *********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 10:55:55 -0400 (0:00:00.342) 0:09:11.939 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 10:55:55 -0400 (0:00:00.208) 0:09:12.148 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 10:55:56 -0400 (0:00:00.722) 0:09:12.870 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 10:55:56 -0400 (0:00:00.223) 0:09:13.093 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 10:55:56 -0400 (0:00:00.317) 0:09:13.411 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 10:55:57 -0400 (0:00:00.361) 0:09:13.772 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 10:55:57 -0400 (0:00:00.245) 0:09:14.017 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 10:55:57 -0400 (0:00:00.394) 0:09:14.412 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 10:55:58 -0400 (0:00:00.298) 0:09:14.711 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 10:55:58 -0400 (0:00:00.299) 0:09:15.011 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 10:55:58 -0400 (0:00:00.165) 0:09:15.176 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 10:55:58 -0400 (0:00:00.193) 0:09:15.370 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 13 May 2025 10:55:59 -0400 (0:00:00.199) 0:09:15.569 *********** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:239 Tuesday 
13 May 2025 10:56:00 -0400 (0:00:01.742) 0:09:17.311 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 10:56:01 -0400 (0:00:00.534) 0:09:17.845 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 10:56:01 -0400 (0:00:00.195) 0:09:18.041 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:56:01 -0400 (0:00:00.286) 0:09:18.327 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:56:02 -0400 (0:00:00.303) 0:09:18.630 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:56:02 -0400 (0:00:00.323) 0:09:18.954 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:56:03 -0400 (0:00:00.583) 0:09:19.537 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:56:03 -0400 (0:00:00.188) 0:09:19.726 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:56:03 -0400 (0:00:00.259) 0:09:19.985 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:56:03 -0400 (0:00:00.174) 0:09:20.159 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:56:04 -0400 (0:00:00.305) 0:09:20.465 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:56:04 -0400 (0:00:00.617) 0:09:21.083 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:56:09 -0400 (0:00:04.909) 0:09:25.992 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:56:09 -0400 (0:00:00.343) 0:09:26.336 *********** ok: [managed-node8] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:56:10 -0400 (0:00:00.303) 0:09:26.639 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:56:15 -0400 (0:00:05.604) 0:09:32.243 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:56:16 -0400 (0:00:00.460) 0:09:32.704 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:56:16 -0400 (0:00:00.187) 0:09:32.891 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:56:16 -0400 (0:00:00.192) 0:09:33.084 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:56:17 -0400 (0:00:00.635) 0:09:33.719 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:56:22 -0400 (0:00:04.887) 0:09:38.607 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service": { "name": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service": { "name": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:56:24 -0400 (0:00:02.655) 0:09:41.263 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service" ] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:56:25 -0400 (0:00:00.254) 0:09:41.518 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2de455c4cc\x2d634c\x2d45f5\x2db789\x2d65424c827f52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "name": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-e455c4cc-634c-45f5-b789-65424c827f52", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e455c4cc-634c-45f5-b789-65424c827f52 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e455c4cc-634c-45f5-b789-65424c827f52 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 10:54:56 EDT", "StateChangeTimestampMonotonic": "2254252281", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d634c\x2d45f5\x2db789\x2d65424c827f52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "name": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", 
"SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:56:28 -0400 (0:00:03.653) 0:09:45.171 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-5c6835a5-2263-482b-9ceb-7d65172113bc' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 10:56:34 -0400 (0:00:05.719) 0:09:50.890 *********** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-5c6835a5-2263-482b-9ceb-7d65172113bc' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 
'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:56:34 -0400 (0:00:00.232) 0:09:51.123 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2de455c4cc\x2d634c\x2d45f5\x2db789\x2d65424c827f52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "name": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2de455c4cc\\x2d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", 
"StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d634c\x2d45f5\x2db789\x2d65424c827f52.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "name": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", 
"IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d634c\\x2d45f5\\x2db789\\x2d65424c827f52.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", 
"SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 10:56:38 -0400 (0:00:03.791) 0:09:54.914 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 10:56:38 -0400 (0:00:00.257) 0:09:55.172 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 10:56:39 -0400 (0:00:00.456) 0:09:55.628 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 13 May 2025 10:56:39 -0400 (0:00:00.354) 0:09:55.983 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148160.5464675, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747148160.5464675, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1747148160.5464675, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "245627441", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 13 May 2025 10:56:41 -0400 (0:00:01.663) 0:09:57.647 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:263 Tuesday 13 May 2025 10:56:41 -0400 (0:00:00.170) 0:09:57.817 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:56:42 -0400 (0:00:00.912) 
0:09:58.730 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:56:42 -0400 (0:00:00.348) 0:09:59.079 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:56:43 -0400 (0:00:00.893) 0:09:59.972 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:56:44 -0400 (0:00:00.660) 0:10:00.632 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:56:44 -0400 (0:00:00.287) 0:10:00.919 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:56:44 -0400 (0:00:00.353) 0:10:01.273 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:56:45 -0400 (0:00:00.220) 0:10:01.494 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:56:45 -0400 (0:00:00.235) 0:10:01.729 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:56:45 -0400 (0:00:00.655) 0:10:02.384 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:56:50 -0400 (0:00:04.473) 0:10:06.858 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:56:50 -0400 (0:00:00.316) 0:10:07.174 *********** ok: [managed-node8] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:56:51 -0400 (0:00:00.278) 0:10:07.453 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:56:56 -0400 (0:00:05.677) 0:10:13.130 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:56:57 -0400 (0:00:00.461) 0:10:13.592 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:56:57 -0400 (0:00:00.247) 0:10:13.839 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:56:57 -0400 (0:00:00.283) 0:10:14.122 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:56:57 -0400 (0:00:00.139) 0:10:14.261 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:57:02 -0400 (0:00:05.003) 0:10:19.265 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service": { "name": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service": { "name": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:57:05 -0400 (0:00:02.962) 0:10:22.227 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:57:06 -0400 (0:00:00.362) 0:10:22.590 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5c6835a5\x2d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-sda1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5c6835a5-2263-482b-9ceb-7d65172113bc /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5c6835a5-2263-482b-9ceb-7d65172113bc ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 10:56:28 EDT", "StateChangeTimestampMonotonic": "2346443829", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => 
(item=systemd-cryptsetup@luk...d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", 
"LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:57:09 -0400 (0:00:03.553) 0:10:26.144 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": null }, { "action": 
"destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 10:57:15 -0400 (0:00:05.949) 0:10:32.093 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 10:57:15 -0400 (0:00:00.149) 0:10:32.243 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148093.9843664, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ca9de83e9846f9f0429f2251d39d9bd4c1f54766", "ctime": 1747148093.9813664, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, 
"isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747148093.9813664, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 10:57:17 -0400 (0:00:01.346) 0:10:33.589 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:57:18 -0400 (0:00:01.791) 0:10:35.380 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5c6835a5\x2d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": 
"0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", 
"StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 10:56:28 EDT", "StateChangeTimestampMonotonic": "2346443829", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": 
"18446744073709551615", "Id": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 10:57:22 -0400 (0:00:03.791) 0:10:39.172 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** 
task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 10:57:23 -0400 (0:00:00.469) 0:10:39.642 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 10:57:23 -0400 (0:00:00.456) 0:10:40.098 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 10:57:23 -0400 (0:00:00.194) 0:10:40.292 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5c6835a5-2263-482b-9ceb-7d65172113bc" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 10:57:25 -0400 (0:00:01.762) 0:10:42.055 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task 
path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 10:57:27 -0400 (0:00:01.921) 0:10:43.977 *********** changed: [managed-node8] => (item={'src': 'UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 10:57:29 -0400 (0:00:01.730) 0:10:45.707 *********** skipping: [managed-node8] => (item={'src': 'UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 10:57:29 -0400 (0:00:00.268) 0:10:45.976 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 10:57:31 -0400 (0:00:01.882) 0:10:47.858 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148107.1683865, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "967b0476a164d3a8c82d07a3591fccb9d5d599a7", "ctime": 1747148100.5043762, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 329252996, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1747148100.5043762, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "2049199830", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 10:57:33 -0400 (0:00:01.727) 0:10:49.586 *********** changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 
'name': 'luks-5c6835a5-2263-482b-9ceb-7d65172113bc', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 10:57:34 -0400 (0:00:01.600) 0:10:51.187 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Tuesday 13 May 2025 10:57:36 -0400 (0:00:01.916) 0:10:53.104 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 10:57:37 -0400 (0:00:00.799) 0:10:53.903 *********** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 10:57:37 -0400 (0:00:00.280) 0:10:54.184 *********** skipping: [managed-node8] => {} TASK [Collect info about the volumes.] 
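The two bookkeeping steps recorded above, adding the UUID-based mount for /opt/test1 and removing the now stale luks-5c6835a5-... line from /etc/crypttab, can be approximated with stock modules. This is a sketch only; the module choice (ansible.posix.mount and community.general.crypttab) is an assumption about how to reproduce the effect by hand, not a statement about the role's internals:

    - name: Ensure the new UUID-based entry is in /etc/fstab and mounted
      ansible.posix.mount:
        path: /opt/test1
        src: UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8
        fstype: xfs
        opts: defaults
        state: mounted

    - name: Remove the stale LUKS mapping from /etc/crypttab
      community.general.crypttab:
        name: luks-5c6835a5-2263-482b-9ceb-7d65172113bc
        state: absent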
***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 10:57:37 -0400 (0:00:00.238) 0:10:54.423 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "efe4912e-f6a0-48ab-ac73-eef62dce49b8" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 10:57:39 -0400 (0:00:01.484) 0:10:55.907 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002634", "end": "2025-05-13 10:57:40.689467", "rc": 0, "start": "2025-05-13 10:57:40.686833" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 10:57:40 -0400 (0:00:01.477) 0:10:57.384 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002648", "end": "2025-05-13 10:57:41.800046", "failed_when_result": false, "rc": 0, "start": "2025-05-13 10:57:41.797398" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 10:57:41 -0400 (0:00:01.041) 0:10:58.426 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 13 May 2025 10:57:42 -0400 (0:00:00.294) 0:10:58.720 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 13 May 2025 10:57:42 -0400 (0:00:00.163) 0:10:58.884 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 13 May 2025 10:57:42 -0400 (0:00:00.200) 0:10:59.084 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 13 May 2025 10:57:42 -0400 (0:00:00.256) 0:10:59.341 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 13 May 2025 10:57:43 -0400 (0:00:00.394) 0:10:59.735 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 13 May 2025 10:57:43 -0400 (0:00:00.437) 0:11:00.173 *********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 13 May 2025 10:57:43 -0400 (0:00:00.231) 0:11:00.407 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 13 May 2025 10:57:44 -0400 (0:00:00.243) 0:11:00.650 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 13 May 2025 10:57:44 -0400 (0:00:00.247) 0:11:00.898 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 13 May 2025 10:57:44 -0400 (0:00:00.269) 0:11:01.168 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 13 May 2025 10:57:44 -0400 (0:00:00.134) 0:11:01.302 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 13 May 2025 10:57:45 -0400 (0:00:00.176) 0:11:01.479 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Tuesday 13 May 2025 10:57:45 -0400 (0:00:00.224) 0:11:01.703 *********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Tuesday 13 May 2025 10:57:45 -0400 
(0:00:00.188) 0:11:01.892 *********** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.80 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Tuesday 13 May 2025 10:57:46 -0400 (0:00:01.206) 0:11:03.098 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Tuesday 13 May 2025 10:57:46 -0400 (0:00:00.194) 0:11:03.293 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 13 May 2025 10:57:47 -0400 (0:00:00.409) 0:11:03.703 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 13 May 2025 10:57:47 -0400 (0:00:00.173) 0:11:03.877 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 13 May 2025 10:57:47 -0400 (0:00:00.246) 0:11:04.124 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 13 May 2025 10:57:47 -0400 (0:00:00.090) 0:11:04.214 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 13 May 2025 10:57:47 -0400 (0:00:00.066) 0:11:04.280 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 13 May 2025 10:57:48 -0400 (0:00:00.214) 0:11:04.495 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 13 May 2025 10:57:48 -0400 (0:00:00.182) 0:11:04.678 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 13 May 2025 10:57:48 -0400 (0:00:00.184) 0:11:04.863 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 13 May 2025 10:57:48 -0400 (0:00:00.305) 0:11:05.168 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 13 May 2025 10:57:49 -0400 (0:00:00.267) 0:11:05.436 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 13 May 2025 10:57:49 -0400 (0:00:00.232) 0:11:05.668 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Tuesday 13 May 2025 10:57:49 -0400 (0:00:00.203) 0:11:05.871 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 13 May 2025 10:57:49 -0400 (0:00:00.407) 0:11:06.279 *********** skipping: [managed-node8] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", 
"_kernel_device": "/dev/sda1", "_mount_id": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Tuesday 13 May 2025 10:57:50 -0400 (0:00:00.264) 0:11:06.543 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 13 May 2025 10:57:50 -0400 (0:00:00.528) 0:11:07.072 *********** skipping: [managed-node8] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Tuesday 13 May 2025 10:57:50 -0400 (0:00:00.322) 0:11:07.394 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 13 May 2025 10:57:51 -0400 (0:00:00.638) 0:11:08.032 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 13 May 2025 10:57:51 -0400 (0:00:00.270) 0:11:08.303 *********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 13 May 2025 10:57:51 -0400 (0:00:00.112) 0:11:08.415 *********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 13 May 2025 10:57:52 -0400 (0:00:00.279) 0:11:08.695 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Tuesday 13 May 2025 10:57:52 -0400 (0:00:00.245) 0:11:08.941 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 13 May 2025 10:57:53 -0400 (0:00:00.751) 0:11:09.692 *********** skipping: [managed-node8] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 
'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Tuesday 13 May 2025 10:57:53 -0400 (0:00:00.328) 0:11:10.020 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 13 May 2025 10:57:54 -0400 (0:00:00.581) 0:11:10.601 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 13 May 2025 10:57:54 -0400 (0:00:00.312) 0:11:10.914 *********** skipping: [managed-node8] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 13 May 2025 10:57:54 -0400 (0:00:00.291) 0:11:11.205 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 13 May 2025 10:57:55 -0400 (0:00:00.252) 0:11:11.458 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 13 May 2025 10:57:55 -0400 (0:00:00.318) 0:11:11.776 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 13 May 2025 10:57:55 -0400 (0:00:00.174) 0:11:11.951 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 13 May 2025 10:57:55 -0400 (0:00:00.355) 0:11:12.307 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Tuesday 13 May 2025 10:57:56 -0400 (0:00:00.246) 0:11:12.554 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 13 May 2025 10:57:56 -0400 (0:00:00.533) 0:11:13.088 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 10:57:56 -0400 (0:00:00.345) 0:11:13.433 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 10:57:57 -0400 (0:00:00.213) 0:11:13.647 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 10:57:57 -0400 (0:00:00.741) 0:11:14.388 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 10:57:58 -0400 (0:00:00.235) 0:11:14.624 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 10:57:58 -0400 (0:00:00.319) 0:11:14.943 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 10:57:58 -0400 (0:00:00.142) 0:11:15.086 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 10:57:58 -0400 (0:00:00.250) 0:11:15.336 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 10:57:59 -0400 (0:00:00.287) 0:11:15.624 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 10:57:59 -0400 (0:00:00.224) 0:11:15.848 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
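The mount check above reduces to confirming that the kernel reports /dev/sda1 mounted at /opt/test1. A stand-alone sketch of the same idea using gathered facts (an illustration, not the assertion used by test-verify-volume-mount.yml):

    - name: Check that /dev/sda1 is mounted at /opt/test1 (sketch)
      ansible.builtin.assert:
        that:
          - ansible_facts.mounts
            | selectattr('device', 'equalto', '/dev/sda1')
            | selectattr('mount', 'equalto', '/opt/test1')
            | list | length == 1
        msg: Expected /dev/sda1 to be mounted at /opt/test1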
****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 10:57:59 -0400 (0:00:00.244) 0:11:16.093 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 10:57:59 -0400 (0:00:00.201) 0:11:16.295 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 10:58:00 -0400 (0:00:00.209) 0:11:16.505 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 10:58:00 -0400 (0:00:00.229) 0:11:16.734 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 10:58:00 -0400 (0:00:00.125) 0:11:16.859 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 10:58:00 -0400 (0:00:00.238) 0:11:17.097 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 10:58:00 -0400 (0:00:00.174) 0:11:17.272 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 10:58:01 -0400 (0:00:00.180) 0:11:17.452 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
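The storage_test_fstab_* facts above are plain pattern matches against the fstab contents read a few tasks earlier. A sketch of the same logic (fstab_cmd stands in for whatever variable the test registers from the cat /etc/fstab command; that name and the _sketch fact are made up for illustration):

    - name: Collect fstab lines that reference the volume (sketch)
      ansible.builtin.set_fact:
        fstab_id_matches_sketch: >-
          {{ fstab_cmd.stdout_lines
             | select('search', 'UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8 ')
             | list }}

    - name: Verify that the device identifier appears exactly once (sketch)
      ansible.builtin.assert:
        that:
          - fstab_id_matches_sketch | length == 1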
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 10:58:01 -0400 (0:00:00.307) 0:11:17.759 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 10:58:01 -0400 (0:00:00.289) 0:11:18.049 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 10:58:01 -0400 (0:00:00.296) 0:11:18.345 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 10:58:02 -0400 (0:00:00.289) 0:11:18.634 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 10:58:02 -0400 (0:00:00.314) 0:11:18.949 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148235.3765817, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148235.3765817, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 169792, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1747148235.3765817, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 10:58:04 -0400 (0:00:01.491) 0:11:20.440 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 10:58:04 -0400 (0:00:00.301) 0:11:20.742 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 10:58:04 -0400 (0:00:00.271) 0:11:21.014 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 10:58:04 -0400 (0:00:00.226) 0:11:21.240 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 10:58:05 -0400 (0:00:00.365) 0:11:21.606 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 10:58:05 -0400 (0:00:00.300) 0:11:21.906 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 10:58:05 -0400 (0:00:00.346) 0:11:22.253 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 10:58:06 -0400 (0:00:00.295) 0:11:22.549 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 10:58:11 -0400 (0:00:04.960) 0:11:27.509 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 10:58:11 -0400 (0:00:00.294) 0:11:27.803 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 10:58:11 -0400 (0:00:00.338) 0:11:28.141 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 10:58:12 -0400 (0:00:00.498) 0:11:28.640 *********** skipping: [managed-node8] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 10:58:12 -0400 (0:00:00.244) 0:11:28.885 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 10:58:12 -0400 (0:00:00.250) 0:11:29.135 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 10:58:12 -0400 (0:00:00.254) 0:11:29.389 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 10:58:13 -0400 (0:00:00.237) 0:11:29.627 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 10:58:13 -0400 (0:00:00.177) 0:11:29.804 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 10:58:13 -0400 (0:00:00.380) 0:11:30.185 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 10:58:14 -0400 (0:00:00.319) 0:11:30.504 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 10:58:14 -0400 (0:00:00.284) 0:11:30.788 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 10:58:14 -0400 (0:00:00.311) 0:11:31.099 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 10:58:14 -0400 (0:00:00.319) 0:11:31.419 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 10:58:15 -0400 (0:00:00.306) 0:11:31.726 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 10:58:15 -0400 (0:00:00.285) 0:11:32.011 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 10:58:16 -0400 (0:00:00.941) 0:11:32.953 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 10:58:16 -0400 (0:00:00.238) 0:11:33.192 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 10:58:17 -0400 (0:00:00.272) 0:11:33.465 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 10:58:17 -0400 (0:00:00.248) 0:11:33.713 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 10:58:17 -0400 (0:00:00.273) 0:11:33.987 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 10:58:17 -0400 (0:00:00.239) 0:11:34.226 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 10:58:18 -0400 (0:00:00.267) 0:11:34.493 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 10:58:18 -0400 (0:00:00.225) 0:11:34.719 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 10:58:18 -0400 (0:00:00.282) 0:11:35.002 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 10:58:18 -0400 (0:00:00.184) 0:11:35.186 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 10:58:19 -0400 (0:00:00.263) 0:11:35.449 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 10:58:19 -0400 (0:00:00.209) 0:11:35.659 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 10:58:19 -0400 (0:00:00.172) 0:11:35.831 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 10:58:19 -0400 (0:00:00.163) 0:11:35.994 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 10:58:19 -0400 (0:00:00.207) 0:11:36.202 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 10:58:20 -0400 (0:00:00.260) 0:11:36.463 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] 
***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 10:58:20 -0400 (0:00:00.232) 0:11:36.696 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 10:58:20 -0400 (0:00:00.217) 0:11:36.913 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 10:58:20 -0400 (0:00:00.191) 0:11:37.105 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 10:58:20 -0400 (0:00:00.208) 0:11:37.314 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 10:58:21 -0400 (0:00:00.230) 0:11:37.544 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 10:58:21 -0400 (0:00:00.192) 0:11:37.736 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 10:58:21 -0400 (0:00:00.139) 0:11:37.876 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 10:58:21 -0400 (0:00:00.197) 0:11:38.074 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 10:58:21 -0400 (0:00:00.244) 0:11:38.318 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 10:58:22 -0400 (0:00:00.293) 0:11:38.612 *********** skipping: [managed-node8] => {} TASK [Show test 
volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 10:58:22 -0400 (0:00:00.259) 0:11:38.872 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 10:58:22 -0400 (0:00:00.217) 0:11:39.089 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 10:58:22 -0400 (0:00:00.217) 0:11:39.307 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 10:58:23 -0400 (0:00:00.193) 0:11:39.501 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 10:58:23 -0400 (0:00:00.168) 0:11:39.669 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 10:58:23 -0400 (0:00:00.184) 0:11:39.854 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 10:58:23 -0400 (0:00:00.191) 0:11:40.046 *********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 10:58:23 -0400 (0:00:00.203) 0:11:40.249 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 10:58:24 -0400 (0:00:00.310) 0:11:40.560 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 10:58:24 -0400 (0:00:00.293) 0:11:40.854 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 10:58:24 -0400 (0:00:00.252) 0:11:41.106 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 10:58:24 -0400 (0:00:00.266) 0:11:41.373 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 10:58:25 -0400 (0:00:00.323) 0:11:41.697 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 10:58:25 -0400 (0:00:00.363) 0:11:42.060 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 10:58:25 -0400 (0:00:00.304) 0:11:42.365 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 10:58:26 -0400 (0:00:00.300) 0:11:42.665 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 10:58:26 -0400 (0:00:00.289) 0:11:42.954 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 10:58:26 -0400 (0:00:00.291) 0:11:43.245 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 10:58:27 -0400 (0:00:00.200) 0:11:43.445 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": 
null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 13 May 2025 10:58:27 -0400 (0:00:00.189) 0:11:43.635 *********** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:286 Tuesday 13 May 2025 10:58:28 -0400 (0:00:01.649) 0:11:45.284 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 10:58:29 -0400 (0:00:00.631) 0:11:45.915 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 10:58:29 -0400 (0:00:00.240) 0:11:46.156 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:58:30 -0400 (0:00:00.344) 0:11:46.501 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:58:30 -0400 (0:00:00.863) 0:11:47.364 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:58:31 -0400 (0:00:00.203) 0:11:47.568 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": 
false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:58:31 -0400 (0:00:00.449) 0:11:48.018 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:58:31 -0400 (0:00:00.175) 0:11:48.193 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:58:31 -0400 (0:00:00.148) 0:11:48.342 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:58:31 -0400 (0:00:00.076) 0:11:48.418 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:58:32 -0400 (0:00:00.168) 0:11:48.587 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:58:32 -0400 (0:00:00.340) 0:11:48.927 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:58:37 -0400 (0:00:04.593) 0:11:53.521 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:58:37 -0400 (0:00:00.326) 0:11:53.847 *********** ok: [managed-node8] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:58:37 -0400 (0:00:00.303) 0:11:54.151 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:58:43 -0400 (0:00:05.800) 0:11:59.952 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:58:44 -0400 (0:00:00.521) 0:12:00.473 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:58:44 -0400 (0:00:00.322) 0:12:00.796 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:58:44 -0400 (0:00:00.271) 0:12:01.068 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:58:44 -0400 (0:00:00.200) 0:12:01.268 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:58:49 -0400 (0:00:04.692) 0:12:05.961 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service": { "name": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service": { "name": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:58:52 -0400 (0:00:02.900) 0:12:08.862 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:58:52 -0400 (0:00:00.342) 0:12:09.204 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d5c6835a5\x2d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-5c6835a5-2263-482b-9ceb-7d65172113bc", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5c6835a5-2263-482b-9ceb-7d65172113bc /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5c6835a5-2263-482b-9ceb-7d65172113bc ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": 
"cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 10:56:28 EDT", "StateChangeTimestampMonotonic": "2346443829", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:58:56 -0400 (0:00:03.546) 0:12:12.751 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 10:59:02 -0400 (0:00:05.771) 0:12:18.522 *********** fatal: [managed-node8]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:59:02 -0400 (0:00:00.241) 0:12:18.763 *********** changed: [managed-node8] => 
(item=systemd-cryptsetup@luks\x2d5c6835a5\x2d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d5c6835a5\\x2d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d2263\x2d482b\x2d9ceb\x2d7d65172113bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "name": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d2263\\x2d482b\\x2d9ceb\\x2d7d65172113bc.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 10:59:05 -0400 (0:00:03.409) 0:12:22.173 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 10:59:06 -0400 (0:00:00.665) 0:12:22.838 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
Tuesday 13 May 2025 10:59:06 -0400 (0:00:00.188) 0:12:23.027 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 13 May 2025 10:59:06 -0400 (0:00:00.279) 0:12:23.306 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148308.5486932, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747148308.5486932, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1747148308.5486932, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "48093191", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 13 May 2025 10:59:07 -0400 (0:00:01.068) 0:12:24.375 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:312 Tuesday 13 May 2025 10:59:08 -0400 (0:00:00.279) 0:12:24.655 *********** ok: [managed-node8] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testhe62dw08lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:319 Tuesday 13 May 2025 10:59:10 -0400 (0:00:02.357) 0:12:27.013 *********** ok: [managed-node8] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testhe62dw08lukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1747148350.8877857-162787-74493583246685/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:326 Tuesday 13 May 2025 10:59:14 -0400 (0:00:03.538) 0:12:30.551 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 10:59:14 -0400 (0:00:00.340) 0:12:30.892 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK 
[fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 10:59:14 -0400 (0:00:00.340) 0:12:31.233 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 10:59:14 -0400 (0:00:00.153) 0:12:31.387 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 10:59:15 -0400 (0:00:00.441) 0:12:31.828 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 10:59:15 -0400 (0:00:00.160) 0:12:31.988 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 10:59:15 -0400 (0:00:00.206) 0:12:32.195 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 10:59:15 -0400 (0:00:00.219) 0:12:32.415 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": 
[] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 10:59:16 -0400 (0:00:00.203) 0:12:32.618 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 10:59:16 -0400 (0:00:00.431) 0:12:33.049 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 10:59:21 -0400 (0:00:04.869) 0:12:37.919 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testhe62dw08lukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 10:59:21 -0400 (0:00:00.237) 0:12:38.157 *********** ok: [managed-node8] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 10:59:22 -0400 (0:00:00.329) 0:12:38.487 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 10:59:27 -0400 (0:00:05.535) 0:12:44.023 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 10:59:27 -0400 (0:00:00.405) 0:12:44.428 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 10:59:28 -0400 (0:00:00.182) 0:12:44.611 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 10:59:28 -0400 (0:00:00.264) 0:12:44.876 *********** TASK [fedora.linux_system_roles.storage : Make sure required 
packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 10:59:28 -0400 (0:00:00.175) 0:12:45.051 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 10:59:33 -0400 (0:00:04.795) 0:12:49.846 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": 
"active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 10:59:36 -0400 (0:00:03.008) 0:12:52.855 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 10:59:36 -0400 (0:00:00.344) 0:12:53.199 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 10:59:36 -0400 (0:00:00.230) 0:12:53.430 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-92d5456b-8615-442e-a019-3986f12860bb", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, 
"encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 10:59:50 -0400 (0:00:13.738) 0:13:07.168 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 10:59:50 -0400 (0:00:00.260) 0:13:07.428 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148248.9646025, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "22c8ae061d3e56edd99c118170d332975635106e", "ctime": 1747148248.9616024, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747148248.9616024, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 10:59:52 -0400 (0:00:01.518) 0:13:08.947 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 10:59:53 
-0400 (0:00:01.433) 0:13:10.381 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 10:59:54 -0400 (0:00:00.217) 0:13:10.598 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-92d5456b-8615-442e-a019-3986f12860bb", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 10:59:54 -0400 (0:00:00.367) 0:13:10.965 *********** ok: 
[managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 10:59:54 -0400 (0:00:00.303) 0:13:11.269 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 10:59:55 -0400 (0:00:00.254) 0:13:11.524 *********** changed: [managed-node8] => (item={'src': 'UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=efe4912e-f6a0-48ab-ac73-eef62dce49b8" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 10:59:56 -0400 (0:00:01.443) 0:13:12.967 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 10:59:58 -0400 (0:00:01.733) 0:13:14.701 
*********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 10:59:59 -0400 (0:00:01.682) 0:13:16.384 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 11:00:00 -0400 (0:00:00.352) 0:13:16.736 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 11:00:02 -0400 (0:00:01.797) 0:13:18.534 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148261.798622, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747148254.4156108, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 484442250, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1747148254.4146106, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2151187438", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 11:00:03 -0400 (0:00:01.533) 0:13:20.068 *********** changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 'name': 'luks-92d5456b-8615-442e-a019-3986f12860bb', 'password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'state': 
'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-92d5456b-8615-442e-a019-3986f12860bb", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 11:00:05 -0400 (0:00:01.532) 0:13:21.600 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:343 Tuesday 13 May 2025 11:00:06 -0400 (0:00:01.575) 0:13:23.176 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 11:00:07 -0400 (0:00:00.286) 0:13:23.463 *********** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 11:00:07 -0400 (0:00:00.245) 0:13:23.708 *********** skipping: [managed-node8] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 11:00:07 -0400 (0:00:00.281) 0:13:23.990 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "size": "10G", "type": "crypt", "uuid": "7d6090ec-88b7-40ee-81b5-9e36ea5658f4" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "92d5456b-8615-442e-a019-3986f12860bb" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 11:00:09 -0400 (0:00:01.581) 0:13:25.571 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002665", "end": "2025-05-13 11:00:10.489633", "rc": 0, "start": "2025-05-13 11:00:10.486968" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 11:00:10 -0400 (0:00:01.680) 0:13:27.252 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002673", "end": "2025-05-13 11:00:11.923687", "failed_when_result": false, "rc": 0, "start": "2025-05-13 11:00:11.921014" } STDOUT: luks-92d5456b-8615-442e-a019-3986f12860bb /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 11:00:12 -0400 (0:00:01.428) 0:13:28.680 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 13 May 2025 11:00:12 -0400 (0:00:00.568) 0:13:29.248 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 13 May 2025 11:00:12 -0400 (0:00:00.183) 0:13:29.431 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 13 May 2025 11:00:13 -0400 (0:00:00.311) 0:13:29.743 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 13 May 2025 11:00:13 -0400 (0:00:00.199) 0:13:29.942 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for 
managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 13 May 2025 11:00:14 -0400 (0:00:00.497) 0:13:30.440 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 13 May 2025 11:00:14 -0400 (0:00:00.269) 0:13:30.710 *********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 13 May 2025 11:00:14 -0400 (0:00:00.191) 0:13:30.902 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 13 May 2025 11:00:14 -0400 (0:00:00.237) 0:13:31.139 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 13 May 2025 11:00:14 -0400 (0:00:00.181) 0:13:31.321 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 13 May 2025 11:00:15 -0400 (0:00:00.205) 0:13:31.526 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 13 May 2025 11:00:15 -0400 (0:00:00.303) 0:13:31.830 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 13 May 2025 11:00:15 -0400 (0:00:00.247) 0:13:32.078 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Tuesday 13 May 2025 11:00:15 -0400 (0:00:00.292) 0:13:32.370 *********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Tuesday 13 May 
2025 11:00:16 -0400 (0:00:00.219) 0:13:32.589 *********** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.80 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Tuesday 13 May 2025 11:00:17 -0400 (0:00:01.587) 0:13:34.177 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Tuesday 13 May 2025 11:00:18 -0400 (0:00:00.276) 0:13:34.453 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 13 May 2025 11:00:19 -0400 (0:00:01.070) 0:13:35.524 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 13 May 2025 11:00:19 -0400 (0:00:00.217) 0:13:35.742 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 13 May 2025 11:00:19 -0400 (0:00:00.269) 0:13:36.011 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 13 May 2025 11:00:19 -0400 (0:00:00.304) 0:13:36.316 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 13 May 2025 11:00:20 -0400 (0:00:00.277) 0:13:36.593 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 13 May 2025 11:00:20 -0400 (0:00:00.271) 0:13:36.865 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 13 May 2025 11:00:20 -0400 (0:00:00.340) 0:13:37.205 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 13 May 2025 11:00:21 -0400 (0:00:00.324) 0:13:37.529 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 13 May 2025 11:00:21 -0400 (0:00:00.199) 0:13:37.729 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 13 May 2025 11:00:21 -0400 (0:00:00.254) 0:13:37.983 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 13 May 2025 11:00:21 -0400 (0:00:00.302) 0:13:38.285 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Tuesday 13 May 2025 11:00:22 -0400 (0:00:00.232) 0:13:38.518 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 13 May 2025 11:00:22 -0400 (0:00:00.475) 0:13:38.993 *********** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", 
"storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Tuesday 13 May 2025 11:00:22 -0400 (0:00:00.389) 0:13:39.382 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 13 May 2025 11:00:23 -0400 (0:00:00.506) 0:13:39.889 *********** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 
0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Tuesday 13 May 2025 11:00:23 -0400 (0:00:00.277) 0:13:40.166 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 13 May 2025 11:00:24 -0400 (0:00:00.414) 0:13:40.581 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 13 May 2025 11:00:24 -0400 (0:00:00.253) 0:13:40.835 *********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 13 May 2025 11:00:24 -0400 (0:00:00.166) 0:13:41.002 *********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 13 May 2025 11:00:24 -0400 (0:00:00.186) 0:13:41.188 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Tuesday 13 May 2025 11:00:24 -0400 (0:00:00.148) 0:13:41.337 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 13 May 2025 11:00:25 -0400 (0:00:00.481) 0:13:41.819 *********** skipping: [managed-node8] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Tuesday 13 May 2025 11:00:25 -0400 (0:00:00.314) 0:13:42.134 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 13 May 2025 11:00:26 -0400 (0:00:00.754) 0:13:42.888 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 13 May 2025 11:00:26 -0400 (0:00:00.346) 0:13:43.234 *********** skipping: [managed-node8] => {} TASK [Get information about 
Stratis] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 13 May 2025 11:00:27 -0400 (0:00:00.284) 0:13:43.519 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 13 May 2025 11:00:27 -0400 (0:00:00.283) 0:13:43.802 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 13 May 2025 11:00:27 -0400 (0:00:00.282) 0:13:44.085 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 13 May 2025 11:00:27 -0400 (0:00:00.288) 0:13:44.374 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 13 May 2025 11:00:28 -0400 (0:00:00.220) 0:13:44.595 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Tuesday 13 May 2025 11:00:28 -0400 (0:00:00.279) 0:13:44.874 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 13 May 2025 11:00:28 -0400 (0:00:00.231) 0:13:45.106 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 11:00:29 -0400 (0:00:00.472) 0:13:45.579 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 11:00:29 -0400 (0:00:00.285) 0:13:45.864 *********** included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 11:00:30 -0400 (0:00:01.120) 0:13:46.985 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 11:00:30 -0400 (0:00:00.277) 0:13:47.262 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 11:00:31 -0400 (0:00:00.769) 0:13:48.032 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 11:00:31 -0400 (0:00:00.269) 0:13:48.301 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 11:00:32 -0400 (0:00:00.405) 0:13:48.706 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 11:00:32 -0400 (0:00:00.304) 0:13:49.011 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 11:00:32 -0400 (0:00:00.306) 0:13:49.317 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 11:00:33 -0400 (0:00:00.345) 0:13:49.663 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 11:00:33 -0400 (0:00:00.248) 0:13:49.911 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 11:00:33 -0400 (0:00:00.322) 0:13:50.234 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 11:00:34 -0400 (0:00:00.288) 0:13:50.523 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 11:00:34 -0400 (0:00:00.333) 0:13:50.856 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 11:00:34 -0400 (0:00:00.425) 0:13:51.282 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 11:00:35 -0400 (0:00:00.341) 0:13:51.624 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 11:00:35 -0400 (0:00:00.262) 0:13:51.887 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 11:00:35 -0400 (0:00:00.397) 0:13:52.284 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 11:00:36 -0400 (0:00:00.337) 0:13:52.621 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 11:00:36 -0400 (0:00:00.174) 0:13:52.795 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 11:00:36 -0400 (0:00:00.298) 0:13:53.094 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 11:00:37 -0400 (0:00:00.383) 0:13:53.478 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148390.261818, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148390.261818, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 201330, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1747148390.261818, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 11:00:38 -0400 (0:00:01.712) 0:13:55.190 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 11:00:39 -0400 (0:00:00.296) 0:13:55.486 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 11:00:39 -0400 (0:00:00.241) 0:13:55.728 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 11:00:39 -0400 (0:00:00.311) 0:13:56.040 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 11:00:39 -0400 (0:00:00.243) 0:13:56.283 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 11:00:40 -0400 (0:00:00.252) 0:13:56.536 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 11:00:40 -0400 (0:00:00.193) 0:13:56.729 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148390.4138181, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148390.4138181, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 201429, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148390.4138181, "nlink": 1, "path": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 11:00:41 -0400 (0:00:01.417) 0:13:58.147 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 11:00:46 -0400 (0:00:04.846) 0:14:02.994 *********** ok: [managed-node8] => { "changed": false, 
"cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.011251", "end": "2025-05-13 11:00:47.742162", "rc": 0, "start": "2025-05-13 11:00:47.730911" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 92d5456b-8615-442e-a019-3986f12860bb Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 926692 Threads: 2 Salt: f5 af ec e5 fb 3e a6 8b 1e 1e 73 56 22 ed 78 59 b6 8a ba e9 16 f8 4a ce 7b 1a 54 48 93 3d b9 00 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 4b fa 28 4e 1c 2b c9 db 7d d6 ad 29 a5 3e 89 2f f0 4c bf 12 f5 d4 0d 36 4b 8b 51 41 c9 08 c3 60 Digest: da bb 2f f1 8c 7b 94 60 0e 42 de 10 6c 4f 75 30 f4 f1 e8 7a fc d4 7f 64 ed 2f 78 b8 9c fb eb 69 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 11:00:48 -0400 (0:00:01.455) 0:14:04.450 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 11:00:48 -0400 (0:00:00.276) 0:14:04.726 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 11:00:48 -0400 (0:00:00.430) 0:14:05.157 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 11:00:49 -0400 (0:00:00.345) 0:14:05.502 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 11:00:49 -0400 (0:00:00.215) 0:14:05.718 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 11:00:49 -0400 (0:00:00.169) 0:14:05.887 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 11:00:49 -0400 (0:00:00.193) 
0:14:06.081 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 11:00:49 -0400 (0:00:00.183) 0:14:06.265 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-92d5456b-8615-442e-a019-3986f12860bb /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 11:00:50 -0400 (0:00:00.274) 0:14:06.539 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 11:00:50 -0400 (0:00:00.233) 0:14:06.772 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 11:00:50 -0400 (0:00:00.193) 0:14:06.966 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 11:00:50 -0400 (0:00:00.297) 0:14:07.264 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 11:00:51 -0400 (0:00:00.258) 0:14:07.522 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 11:00:51 -0400 (0:00:00.195) 0:14:07.718 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 11:00:51 -0400 (0:00:00.280) 0:14:07.998 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 11:00:51 -0400 (0:00:00.238) 0:14:08.237 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 11:00:52 -0400 (0:00:00.308) 0:14:08.545 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 11:00:52 -0400 (0:00:00.197) 0:14:08.743 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 11:00:52 -0400 (0:00:00.313) 0:14:09.056 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 11:00:52 -0400 (0:00:00.210) 0:14:09.267 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 11:00:53 -0400 (0:00:00.311) 0:14:09.579 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 11:00:53 -0400 (0:00:00.255) 0:14:09.834 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 11:00:53 -0400 (0:00:00.325) 0:14:10.160 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 11:00:53 -0400 (0:00:00.211) 0:14:10.371 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 11:00:54 -0400 (0:00:00.271) 0:14:10.643 *********** skipping: 
[managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 11:00:54 -0400 (0:00:00.256) 0:14:10.899 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 11:00:54 -0400 (0:00:00.280) 0:14:11.180 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 11:00:55 -0400 (0:00:00.257) 0:14:11.438 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 11:00:55 -0400 (0:00:00.247) 0:14:11.685 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 11:00:55 -0400 (0:00:00.279) 0:14:11.964 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 11:00:55 -0400 (0:00:00.374) 0:14:12.339 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 11:00:56 -0400 (0:00:00.243) 0:14:12.582 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 11:00:56 -0400 (0:00:00.225) 0:14:12.808 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 11:00:56 -0400 (0:00:00.240) 0:14:13.049 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 11:00:56 -0400 (0:00:00.295) 0:14:13.344 
*********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 11:00:57 -0400 (0:00:00.244) 0:14:13.589 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 11:00:57 -0400 (0:00:00.297) 0:14:13.887 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 11:00:57 -0400 (0:00:00.225) 0:14:14.112 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 11:00:58 -0400 (0:00:00.799) 0:14:14.912 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 11:00:58 -0400 (0:00:00.296) 0:14:15.208 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 11:00:59 -0400 (0:00:00.259) 0:14:15.467 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 11:00:59 -0400 (0:00:00.351) 0:14:15.819 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 11:00:59 -0400 (0:00:00.244) 0:14:16.063 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 11:00:59 -0400 (0:00:00.235) 0:14:16.298 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 11:01:00 -0400 (0:00:00.204) 0:14:16.503 
*********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 11:01:00 -0400 (0:00:00.153) 0:14:16.656 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 11:01:00 -0400 (0:00:00.170) 0:14:16.826 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 11:01:00 -0400 (0:00:00.186) 0:14:17.012 *********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 11:01:00 -0400 (0:00:00.209) 0:14:17.222 *********** ok: [managed-node8] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 11:01:00 -0400 (0:00:00.207) 0:14:17.429 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 11:01:01 -0400 (0:00:00.255) 0:14:17.685 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 11:01:01 -0400 (0:00:00.232) 0:14:17.917 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 11:01:01 -0400 (0:00:00.337) 0:14:18.255 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 11:01:02 -0400 (0:00:00.265) 0:14:18.520 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" 
} TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 11:01:02 -0400 (0:00:00.297) 0:14:18.817 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 11:01:02 -0400 (0:00:00.215) 0:14:19.032 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 11:01:02 -0400 (0:00:00.230) 0:14:19.263 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 11:01:03 -0400 (0:00:00.225) 0:14:19.489 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 11:01:03 -0400 (0:00:00.164) 0:14:19.653 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 11:01:03 -0400 (0:00:00.203) 0:14:19.857 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:346 Tuesday 13 May 2025 11:01:03 -0400 (0:00:00.210) 0:14:20.067 *********** ok: [managed-node8] => { "changed": false, "path": "/tmp/storage_testhe62dw08lukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:356 Tuesday 13 May 2025 11:01:05 -0400 (0:00:01.489) 0:14:21.556 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 11:01:05 -0400 (0:00:00.452) 0:14:22.008 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 11:01:05 -0400 (0:00:00.266) 0:14:22.275 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:01:06 -0400 (0:00:00.413) 0:14:22.688 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:01:06 -0400 (0:00:00.330) 0:14:23.019 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:01:06 -0400 (0:00:00.292) 0:14:23.312 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:01:07 -0400 (0:00:00.495) 0:14:23.807 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:01:07 -0400 (0:00:00.261) 0:14:24.069 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:01:07 -0400 (0:00:00.202) 0:14:24.272 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:01:07 -0400 (0:00:00.130) 0:14:24.403 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:01:08 -0400 (0:00:00.123) 0:14:24.526 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:01:08 -0400 (0:00:00.273) 0:14:24.799 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:01:12 -0400 (0:00:04.279) 0:14:29.079 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:01:12 -0400 (0:00:00.258) 0:14:29.338 *********** ok: [managed-node8] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:01:13 -0400 (0:00:00.258) 0:14:29.596 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:01:18 -0400 (0:00:05.394) 0:14:34.991 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:01:19 -0400 (0:00:00.573) 0:14:35.564 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 
11:01:19 -0400 (0:00:00.257) 0:14:35.822 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:01:19 -0400 (0:00:00.289) 0:14:36.111 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:01:19 -0400 (0:00:00.225) 0:14:36.336 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:01:24 -0400 (0:00:04.652) 0:14:40.989 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": 
"systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": 
"systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:01:28 -0400 (0:00:03.800) 0:14:44.790 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:01:28 -0400 (0:00:00.431) 0:14:45.222 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:01:29 -0400 (0:00:00.712) 0:14:45.934 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 11:01:34 -0400 (0:00:05.365) 0:14:51.300 *********** fatal: [managed-node8]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:01:35 -0400 (0:00:00.315) 0:14:51.616 *********** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 11:01:35 -0400 (0:00:00.269) 0:14:51.885 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 11:01:35 -0400 (0:00:00.274) 0:14:52.159 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 11:01:36 -0400 (0:00:00.331) 0:14:52.491 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:374 Tuesday 13 May 2025 11:01:36 -0400 (0:00:00.244) 0:14:52.735 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:01:36 -0400 (0:00:00.402) 0:14:53.137 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:01:36 -0400 (0:00:00.263) 0:14:53.400 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:01:37 -0400 (0:00:00.277) 0:14:53.678 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:01:37 -0400 (0:00:00.422) 0:14:54.101 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:01:37 -0400 (0:00:00.197) 0:14:54.299 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:01:38 -0400 (0:00:00.253) 0:14:54.552 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:01:38 -0400 (0:00:00.224) 0:14:54.776 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:01:38 -0400 (0:00:00.221) 0:14:54.997 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:01:39 -0400 (0:00:00.593) 0:14:55.591 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:01:44 -0400 (0:00:04.913) 0:15:00.504 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:01:44 -0400 (0:00:00.322) 0:15:00.827 *********** ok: [managed-node8] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:01:44 -0400 (0:00:00.308) 0:15:01.135 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:01:50 -0400 (0:00:05.556) 0:15:06.691 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:01:50 -0400 (0:00:00.443) 0:15:07.135 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 11:01:50 -0400 (0:00:00.215) 0:15:07.351 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:01:51 -0400 (0:00:00.339) 0:15:07.690 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:01:51 -0400 (0:00:00.230) 0:15:07.920 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:01:56 -0400 (0:00:04.936) 0:15:12.857 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:01:59 -0400 (0:00:02.736) 0:15:15.593 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:02:00 
-0400 (0:00:00.856) 0:15:16.449 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:02:00 -0400 (0:00:00.175) 0:15:16.624 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-92d5456b-8615-442e-a019-3986f12860bb", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f464edab-7000-480c-9b25-609fde3e3013", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": 
null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 11:02:12 -0400 (0:00:12.392) 0:15:29.016 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 11:02:12 -0400 (0:00:00.318) 0:15:29.335 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148399.6068323, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "24959d9097d906d02350a2fc8887141d30acdf45", "ctime": 1747148399.602832, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747148399.602832, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 11:02:14 -0400 (0:00:01.514) 0:15:30.850 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:02:16 -0400 (0:00:01.667) 0:15:32.518 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 11:02:16 -0400 (0:00:00.166) 0:15:32.684 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": 
"/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-92d5456b-8615-442e-a019-3986f12860bb", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f464edab-7000-480c-9b25-609fde3e3013", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 11:02:16 -0400 (0:00:00.272) 0:15:32.956 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 11:02:16 -0400 (0:00:00.321) 0:15:33.277 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 11:02:17 -0400 (0:00:00.253) 0:15:33.531 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-92d5456b-8615-442e-a019-3986f12860bb" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 11:02:18 -0400 (0:00:01.812) 0:15:35.343 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 11:02:20 -0400 (0:00:01.779) 0:15:37.122 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 
'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 11:02:22 -0400 (0:00:01.897) 0:15:39.019 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 11:02:22 -0400 (0:00:00.328) 0:15:39.347 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 11:02:24 -0400 (0:00:01.880) 0:15:41.227 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148411.922851, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d0d230dca5839c196a95d93936064facfc744002", "ctime": 1747148404.8678403, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 98566409, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1747148404.8668404, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "2289819606", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 11:02:26 -0400 (0:00:01.538) 0:15:42.766 *********** changed: [managed-node8] => (item={'backing_device': '/dev/sda1', 'name': 'luks-92d5456b-8615-442e-a019-3986f12860bb', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-92d5456b-8615-442e-a019-3986f12860bb", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) 
removed changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-f464edab-7000-480c-9b25-609fde3e3013', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f464edab-7000-480c-9b25-609fde3e3013", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 11:02:29 -0400 (0:00:02.797) 0:15:45.563 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:393 Tuesday 13 May 2025 11:02:31 -0400 (0:00:02.197) 0:15:47.761 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 11:02:31 -0400 (0:00:00.479) 0:15:48.240 *********** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 11:02:32 -0400 (0:00:00.297) 0:15:48.538 *********** skipping: [managed-node8] => {} TASK [Collect info about the volumes.] 
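The pool information printed above reflects the re-created, encrypted layout: LVM pool "foo" on /dev/sda with a 4g XFS volume "test1" mounted at /opt/test1 and encrypted with LUKS1 (aes-xts-plain64, 512-bit key). A role invocation along the following lines would request that state; this is a sketch inferred from the logged parameters, not the verbatim test playbook, and the password variable is hypothetical (the real value is masked by no_log):

    - name: Re-create the volume as an encrypted LVM logical volume
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks1
                encryption_cipher: aes-xts-plain64
                encryption_key_size: 512
                encryption_password: "{{ luks_test_password }}"  # hypothetical variable; actual value not logged
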
***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 11:02:32 -0400 (0:00:00.252) 0:15:48.791 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f464edab-7000-480c-9b25-609fde3e3013" }, "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "size": "4G", "type": "crypt", "uuid": "e631b1fe-c95e-4959-af84-c0ae18278056" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 11:02:33 -0400 (0:00:01.400) 0:15:50.191 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002549", "end": "2025-05-13 11:02:34.883492", "rc": 0, "start": "2025-05-13 11:02:34.880943" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 11:02:35 -0400 (0:00:01.454) 0:15:51.645 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002698", "end": "2025-05-13 11:02:36.489683", "failed_when_result": false, "rc": 0, "start": "2025-05-13 11:02:36.486985" } STDOUT: luks-f464edab-7000-480c-9b25-609fde3e3013 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 11:02:36 -0400 (0:00:01.579) 0:15:53.225 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 13 May 2025 11:02:37 -0400 (0:00:00.383) 0:15:53.609 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 13 May 2025 11:02:37 -0400 (0:00:00.192) 0:15:53.801 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022475", "end": "2025-05-13 11:02:38.684473", "rc": 0, "start": "2025-05-13 11:02:38.661998" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 13 May 2025 11:02:38 -0400 (0:00:01.593) 0:15:55.394 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 13 May 2025 11:02:39 -0400 (0:00:00.391) 0:15:55.785 *********** included: 
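The single /etc/crypttab entry read here follows the standard crypttab(5) field order: mapped name, backing device, key file ("-" meaning no key file, so a passphrase prompt or other unlock method is used), optionally followed by options. The role is therefore expected to leave a line of the form

    luks-<LUKS-UUID>  <backing-device>  -

which matches the entry shown (luks-f464edab-7000-480c-9b25-609fde3e3013 backed by /dev/mapper/foo-test1).
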
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 13 May 2025 11:02:39 -0400 (0:00:00.523) 0:15:56.309 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 13 May 2025 11:02:40 -0400 (0:00:00.452) 0:15:56.761 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 13 May 2025 11:02:42 -0400 (0:00:02.430) 0:15:59.191 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 13 May 2025 11:02:43 -0400 (0:00:00.331) 0:15:59.523 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 13 May 2025 11:02:43 -0400 (0:00:00.214) 0:15:59.738 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 13 May 2025 11:02:43 -0400 (0:00:00.231) 0:15:59.969 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 13 May 2025 11:02:43 -0400 (0:00:00.289) 0:16:00.259 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 13 May 2025 11:02:44 -0400 (0:00:00.317) 0:16:00.577 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 
Tuesday 13 May 2025 11:02:44 -0400 (0:00:00.243) 0:16:00.820 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Tuesday 13 May 2025 11:02:44 -0400 (0:00:00.270) 0:16:01.091 *********** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.80 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Tuesday 13 May 2025 11:02:46 -0400 (0:00:01.578) 0:16:02.670 *********** skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Tuesday 13 May 2025 11:02:46 -0400 (0:00:00.258) 0:16:02.928 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 13 May 2025 11:02:46 -0400 (0:00:00.456) 0:16:03.385 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 13 May 2025 11:02:47 -0400 (0:00:00.279) 0:16:03.664 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 13 May 2025 11:02:47 -0400 (0:00:00.369) 0:16:04.033 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 13 May 2025 11:02:47 -0400 (0:00:00.292) 0:16:04.326 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 13 May 2025 11:02:48 -0400 (0:00:00.409) 0:16:04.736 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 13 May 2025 11:02:48 
-0400 (0:00:00.390) 0:16:05.126 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 13 May 2025 11:02:48 -0400 (0:00:00.171) 0:16:05.298 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 13 May 2025 11:02:49 -0400 (0:00:00.227) 0:16:05.526 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 13 May 2025 11:02:49 -0400 (0:00:00.158) 0:16:05.685 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 13 May 2025 11:02:49 -0400 (0:00:00.235) 0:16:05.920 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 13 May 2025 11:02:49 -0400 (0:00:00.171) 0:16:06.091 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Tuesday 13 May 2025 11:02:49 -0400 (0:00:00.141) 0:16:06.233 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 13 May 2025 11:02:50 -0400 (0:00:00.367) 0:16:06.601 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 13 May 2025 11:02:51 -0400 (0:00:00.877) 0:16:07.478 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 13 May 2025 11:02:51 
-0400 (0:00:00.202) 0:16:07.680 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 13 May 2025 11:02:51 -0400 (0:00:00.198) 0:16:07.879 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 13 May 2025 11:02:51 -0400 (0:00:00.253) 0:16:08.132 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 13 May 2025 11:02:51 -0400 (0:00:00.208) 0:16:08.341 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 13 May 2025 11:02:52 -0400 (0:00:00.185) 0:16:08.526 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 13 May 2025 11:02:52 -0400 (0:00:00.344) 0:16:08.871 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Tuesday 13 May 2025 11:02:52 -0400 (0:00:00.282) 0:16:09.153 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 13 May 2025 11:02:53 -0400 (0:00:00.496) 0:16:09.650 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 13 May 2025 11:02:53 -0400 (0:00:00.321) 0:16:09.971 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 13 May 2025 11:02:53 -0400 (0:00:00.307) 0:16:10.279 *********** skipping: [managed-node8] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 13 May 2025 11:02:54 -0400 (0:00:00.161) 0:16:10.441 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 13 May 2025 11:02:54 -0400 (0:00:00.496) 0:16:10.937 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Tuesday 13 May 2025 11:02:54 -0400 (0:00:00.208) 0:16:11.145 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 13 May 2025 11:02:55 -0400 (0:00:00.711) 0:16:11.857 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 13 May 2025 11:02:55 -0400 (0:00:00.216) 0:16:12.074 *********** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 13 May 2025 11:02:55 -0400 (0:00:00.319) 0:16:12.393 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 13 May 2025 11:02:56 -0400 (0:00:00.495) 0:16:12.889 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 13 May 2025 11:02:56 -0400 (0:00:00.229) 0:16:13.118 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 13 May 
2025 11:02:56 -0400 (0:00:00.272) 0:16:13.391 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 13 May 2025 11:02:57 -0400 (0:00:00.260) 0:16:13.651 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 13 May 2025 11:02:57 -0400 (0:00:00.338) 0:16:13.989 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 13 May 2025 11:02:57 -0400 (0:00:00.281) 0:16:14.270 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 13 May 2025 11:02:58 -0400 (0:00:00.263) 0:16:14.533 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Tuesday 13 May 2025 11:02:58 -0400 (0:00:00.168) 0:16:14.702 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 13 May 2025 11:02:58 -0400 (0:00:00.553) 0:16:15.256 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 13 May 2025 11:02:59 -0400 (0:00:00.410) 0:16:15.667 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 13 May 2025 11:02:59 -0400 (0:00:00.301) 0:16:15.969 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 13 May 2025 11:02:59 -0400 (0:00:00.290) 
0:16:16.259 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 13 May 2025 11:03:00 -0400 (0:00:00.344) 0:16:16.604 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 13 May 2025 11:03:00 -0400 (0:00:00.367) 0:16:16.971 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 13 May 2025 11:03:00 -0400 (0:00:00.223) 0:16:17.195 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 13 May 2025 11:03:01 -0400 (0:00:00.304) 0:16:17.500 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Tuesday 13 May 2025 11:03:01 -0400 (0:00:00.310) 0:16:17.810 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 13 May 2025 11:03:02 -0400 (0:00:00.702) 0:16:18.513 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 13 May 2025 11:03:02 -0400 (0:00:00.307) 0:16:18.821 *********** skipping: [managed-node8] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 13 May 2025 11:03:02 -0400 (0:00:00.291) 0:16:19.112 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 13 May 2025 11:03:02 -0400 (0:00:00.155) 0:16:19.267 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 13 May 2025 11:03:03 -0400 (0:00:00.200) 0:16:19.468 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 13 May 2025 11:03:03 -0400 (0:00:00.202) 0:16:19.670 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 13 May 2025 11:03:03 -0400 (0:00:00.219) 0:16:19.890 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Tuesday 13 May 2025 11:03:03 -0400 (0:00:00.140) 0:16:20.031 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 13 May 2025 11:03:03 -0400 (0:00:00.175) 0:16:20.206 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 11:03:04 -0400 (0:00:00.971) 0:16:21.178 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 11:03:05 -0400 (0:00:00.265) 0:16:21.443 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 11:03:06 -0400 (0:00:01.134) 0:16:22.578 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 11:03:06 -0400 (0:00:00.243) 0:16:22.821 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 11:03:06 -0400 (0:00:00.330) 0:16:23.152 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 11:03:07 -0400 (0:00:00.308) 0:16:23.460 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 11:03:07 -0400 (0:00:00.320) 0:16:23.781 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 11:03:07 -0400 (0:00:00.269) 0:16:24.051 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 11:03:07 -0400 (0:00:00.268) 0:16:24.320 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 11:03:08 -0400 (0:00:00.265) 0:16:24.585 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 11:03:08 -0400 (0:00:00.344) 0:16:24.929 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 11:03:08 -0400 (0:00:00.340) 0:16:25.270 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 11:03:09 -0400 (0:00:00.326) 0:16:25.597 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 11:03:09 -0400 (0:00:00.175) 0:16:25.772 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 11:03:09 -0400 (0:00:00.363) 0:16:26.136 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 11:03:09 -0400 (0:00:00.292) 0:16:26.428 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 11:03:10 -0400 (0:00:00.216) 0:16:26.645 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 11:03:10 -0400 (0:00:00.239) 0:16:26.884 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 11:03:10 -0400 (0:00:00.232) 0:16:27.117 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 11:03:10 -0400 (0:00:00.220) 0:16:27.337 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 11:03:11 -0400 (0:00:00.385) 0:16:27.722 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 11:03:11 -0400 (0:00:00.353) 0:16:28.076 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148531.9390354, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148531.9390354, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 214950, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148531.9390354, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 11:03:13 -0400 (0:00:01.570) 0:16:29.646 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 11:03:13 -0400 (0:00:00.263) 0:16:29.910 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 11:03:13 -0400 (0:00:00.276) 0:16:30.186 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 11:03:14 -0400 (0:00:00.489) 0:16:30.676 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 11:03:14 -0400 (0:00:00.322) 0:16:30.998 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 11:03:14 -0400 (0:00:00.258) 0:16:31.257 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 11:03:15 -0400 (0:00:00.227) 0:16:31.484 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148532.0910356, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148532.0910356, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217111, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148532.0910356, "nlink": 1, "path": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 11:03:16 -0400 (0:00:01.512) 0:16:32.997 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 11:03:21 -0400 (0:00:04.556) 0:16:37.554 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010140", "end": "2025-05-13 11:03:22.375336", "rc": 0, "start": "2025-05-13 11:03:22.365196" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 6a e2 dd ad 0e 67 f2 f3 58 c3 8d b1 ec e5 6a 6f 59 01 33 67 MK salt: 2a b1 d1 b8 2c 7c 37 bb a6 f6 f6 5f 67 f5 9b 28 1f e1 ff e7 80 bb c8 9d b5 35 13 6f 8f 4e 3a 81 MK iterations: 120470 UUID: f464edab-7000-480c-9b25-609fde3e3013 Key Slot 0: ENABLED Iterations: 1923992 Salt: d7 31 cd e5 69 16 12 29 54 c8 d6 db f3 71 e3 94 74 c5 88 0b ae 8b 5e 0f 49 b3 29 25 bc ea d8 6c Key 
material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 11:03:22 -0400 (0:00:01.514) 0:16:39.069 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 11:03:23 -0400 (0:00:00.383) 0:16:39.452 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 11:03:23 -0400 (0:00:00.443) 0:16:39.895 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 11:03:23 -0400 (0:00:00.326) 0:16:40.222 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 11:03:24 -0400 (0:00:00.327) 0:16:40.550 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 11:03:24 -0400 (0:00:00.391) 0:16:40.942 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 11:03:24 -0400 (0:00:00.239) 0:16:41.181 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 11:03:24 -0400 (0:00:00.231) 0:16:41.413 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f464edab-7000-480c-9b25-609fde3e3013 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 11:03:25 -0400 (0:00:00.844) 0:16:42.258 *********** ok: [managed-node8] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 11:03:26 -0400 (0:00:00.242) 0:16:42.500 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 11:03:26 -0400 (0:00:00.155) 0:16:42.655 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 11:03:26 -0400 (0:00:00.357) 0:16:43.013 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 11:03:26 -0400 (0:00:00.300) 0:16:43.313 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 11:03:27 -0400 (0:00:00.162) 0:16:43.476 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 11:03:27 -0400 (0:00:00.343) 0:16:43.819 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 11:03:27 -0400 (0:00:00.229) 0:16:44.049 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 11:03:27 -0400 (0:00:00.320) 0:16:44.370 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 11:03:28 -0400 (0:00:00.225) 0:16:44.596 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 11:03:28 -0400 (0:00:00.262) 0:16:44.858 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 11:03:28 -0400 (0:00:00.275) 0:16:45.133 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 11:03:29 -0400 (0:00:00.307) 0:16:45.441 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 11:03:29 -0400 (0:00:00.238) 0:16:45.680 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 11:03:29 -0400 (0:00:00.283) 0:16:45.963 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 11:03:29 -0400 (0:00:00.269) 0:16:46.233 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 11:03:32 -0400 (0:00:02.634) 0:16:48.868 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 11:03:34 -0400 (0:00:01.687) 0:16:50.555 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 11:03:34 -0400 (0:00:00.304) 0:16:50.860 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 11:03:34 -0400 (0:00:00.259) 0:16:51.119 
*********** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 11:03:36 -0400 (0:00:01.793) 0:16:52.912 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 11:03:36 -0400 (0:00:00.318) 0:16:53.230 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 11:03:37 -0400 (0:00:00.229) 0:16:53.460 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 11:03:37 -0400 (0:00:00.269) 0:16:53.730 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 11:03:37 -0400 (0:00:00.308) 0:16:54.038 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 11:03:37 -0400 (0:00:00.177) 0:16:54.216 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 11:03:38 -0400 (0:00:00.251) 0:16:54.468 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 11:03:38 -0400 (0:00:00.295) 0:16:54.764 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 11:03:38 -0400 (0:00:00.262) 0:16:55.027 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 11:03:38 -0400 (0:00:00.235) 
0:16:55.262 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 11:03:39 -0400 (0:00:00.218) 0:16:55.480 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 11:03:39 -0400 (0:00:00.195) 0:16:55.676 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 11:03:39 -0400 (0:00:00.282) 0:16:55.958 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 11:03:39 -0400 (0:00:00.299) 0:16:56.258 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 11:03:40 -0400 (0:00:00.302) 0:16:56.560 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 11:03:40 -0400 (0:00:00.358) 0:16:56.918 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 11:03:40 -0400 (0:00:00.287) 0:16:57.206 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 11:03:41 -0400 (0:00:00.297) 0:16:57.503 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 11:03:41 -0400 (0:00:00.256) 0:16:57.759 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 11:03:41 -0400 (0:00:00.173) 
0:16:57.933 *********** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 11:03:41 -0400 (0:00:00.249) 0:16:58.183 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 11:03:41 -0400 (0:00:00.240) 0:16:58.424 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 11:03:42 -0400 (0:00:00.329) 0:16:58.753 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024561", "end": "2025-05-13 11:03:43.638924", "rc": 0, "start": "2025-05-13 11:03:43.614363" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 11:03:43 -0400 (0:00:01.593) 0:17:00.347 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 11:03:44 -0400 (0:00:00.221) 0:17:00.569 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 11:03:44 -0400 (0:00:00.294) 0:17:00.863 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 11:03:44 -0400 (0:00:00.282) 0:17:01.146 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 11:03:44 -0400 (0:00:00.275) 0:17:01.421 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 11:03:45 -0400 (0:00:00.279) 0:17:01.701 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 11:03:45 -0400 (0:00:00.215) 0:17:01.917 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 11:03:45 -0400 (0:00:00.281) 0:17:02.199 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 11:03:45 -0400 (0:00:00.167) 0:17:02.366 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:396 Tuesday 13 May 2025 11:03:46 -0400 (0:00:00.170) 0:17:02.537 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:03:46 -0400 (0:00:00.389) 0:17:02.927 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:03:47 -0400 (0:00:01.044) 0:17:03.971 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:03:47 -0400 (0:00:00.280) 0:17:04.252 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" 
} ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:03:48 -0400 (0:00:00.479) 0:17:04.732 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:03:48 -0400 (0:00:00.267) 0:17:04.999 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:03:48 -0400 (0:00:00.198) 0:17:05.197 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:03:48 -0400 (0:00:00.139) 0:17:05.337 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:03:49 -0400 (0:00:00.289) 0:17:05.626 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:03:49 -0400 (0:00:00.423) 0:17:06.050 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:03:54 -0400 (0:00:04.567) 0:17:10.617 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:03:54 -0400 (0:00:00.205) 0:17:10.823 *********** ok: [managed-node8] => { 
"storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:03:54 -0400 (0:00:00.267) 0:17:11.090 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:04:00 -0400 (0:00:05.582) 0:17:16.672 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:04:00 -0400 (0:00:00.388) 0:17:17.061 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 11:04:00 -0400 (0:00:00.194) 0:17:17.256 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:04:01 -0400 (0:00:00.322) 0:17:17.578 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:04:01 -0400 (0:00:00.275) 0:17:17.853 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:04:06 -0400 (0:00:04.828) 0:17:22.682 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": 
"sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service": { "name": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service": { "name": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { 
"name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:04:09 -0400 (0:00:03.079) 0:17:25.761 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:04:09 -0400 (0:00:00.373) 0:17:26.134 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d92d5456b\x2d8615\x2d442e\x2da019\x2d3986f12860bb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "name": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-sda1.device systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-92d5456b-8615-442e-a019-3986f12860bb", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach 
luks-92d5456b-8615-442e-a019-3986f12860bb /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-92d5456b-8615-442e-a019-3986f12860bb ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": 
"no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 11:02:24 EDT", "StateChangeTimestampMonotonic": "2702459834", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d8615\x2d442e\x2da019\x2d3986f12860bb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "name": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "DevicePolicy": "auto", 
"DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", 
"StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:04:13 -0400 (0:00:03.474) 0:17:29.609 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : 
Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 11:04:18 -0400 (0:00:05.722) 0:17:35.332 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 11:04:19 -0400 (0:00:00.264) 0:17:35.596 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148542.229051, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a9b2e05858d5ce987640d10181a613dd2d9428af", "ctime": 1747148542.226051, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747148542.226051, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 11:04:20 -0400 (0:00:01.546) 0:17:37.143 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:04:21 -0400 (0:00:00.308) 0:17:37.451 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2d92d5456b\x2d8615\x2d442e\x2da019\x2d3986f12860bb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "name": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot 
cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d92d5456b\\x2d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", 
"PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d8615\x2d442e\x2da019\x2d3986f12860bb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "name": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d8615\\x2d442e\\x2da019\\x2d3986f12860bb.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 11:04:24 -0400 (0:00:03.376) 0:17:40.827 *********** ok: [managed-node8] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 11:04:24 -0400 (0:00:00.208) 0:17:41.036 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 11:04:24 -0400 (0:00:00.202) 0:17:41.239 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 11:04:25 -0400 (0:00:00.219) 0:17:41.458 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 11:04:25 -0400 (0:00:00.225) 0:17:41.684 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 11:04:27 -0400 
(0:00:02.032) 0:17:43.716 *********** ok: [managed-node8] => (item={'src': '/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 11:04:29 -0400 (0:00:02.393) 0:17:46.110 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 11:04:30 -0400 (0:00:00.350) 0:17:46.461 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 11:04:31 -0400 (0:00:01.862) 0:17:48.323 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148556.489073, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1fc12588c0982e5b295b191e3ee0297832b97702", "ctime": 1747148548.8770614, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 220201157, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1747148548.8770614, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3620939784", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 11:04:33 -0400 (0:00:01.428) 0:17:49.752 *********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 11:04:33 -0400 (0:00:00.240) 0:17:49.993 *********** ok: [managed-node8] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:410 Tuesday 13 May 2025 11:04:35 -0400 (0:00:02.009) 0:17:52.002 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:417 Tuesday 13 May 2025 11:04:35 -0400 (0:00:00.302) 0:17:52.305 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 11:04:36 -0400 (0:00:00.406) 0:17:52.711 *********** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 11:04:36 -0400 (0:00:00.152) 0:17:52.864 *********** skipping: [managed-node8] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 11:04:36 -0400 (0:00:00.282) 0:17:53.146 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f464edab-7000-480c-9b25-609fde3e3013" }, "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "size": "4G", "type": "crypt", "uuid": "e631b1fe-c95e-4959-af84-c0ae18278056" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 11:04:38 -0400 (0:00:01.588) 0:17:54.735 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003157", "end": "2025-05-13 11:04:39.559946", "rc": 0, "start": "2025-05-13 11:04:39.556789" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 11:04:39 -0400 (0:00:01.627) 0:17:56.362 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003021", "end": "2025-05-13 11:04:41.097818", "failed_when_result": false, "rc": 0, "start": "2025-05-13 11:04:41.094797" } STDOUT: luks-f464edab-7000-480c-9b25-609fde3e3013 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 11:04:41 -0400 (0:00:01.420) 0:17:57.783 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 13 May 2025 11:04:41 -0400 (0:00:00.375) 0:17:58.158 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 13 May 2025 11:04:42 -0400 (0:00:00.320) 0:17:58.479 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.028759", "end": "2025-05-13 11:04:43.317662", "rc": 0, "start": "2025-05-13 11:04:43.288903" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 13 May 2025 11:04:43 -0400 (0:00:01.717) 0:18:00.197 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 13 May 2025 11:04:44 -0400 (0:00:00.338) 0:18:00.536 *********** included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 13 May 2025 11:04:44 -0400 (0:00:00.673) 0:18:01.210 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 13 May 2025 11:04:45 -0400 (0:00:00.398) 0:18:01.608 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 13 May 2025 11:04:46 -0400 (0:00:01.615) 0:18:03.223 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 13 May 2025 11:04:47 -0400 (0:00:00.283) 0:18:03.507 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 13 May 2025 11:04:47 -0400 (0:00:00.333) 0:18:03.841 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 13 May 2025 11:04:47 -0400 (0:00:00.323) 0:18:04.164 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 13 May 2025 11:04:48 -0400 (0:00:00.294) 0:18:04.458 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 13 May 2025 11:04:48 -0400 (0:00:00.361) 0:18:04.820 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 
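(Aside on the pool-member checks traced above: they reduce to comparing the physical volumes reported for pool "foo" against the expected count and device list. Below is a minimal, self-contained sketch of that comparison, not the literal content of test-verify-pool-members.yml — the localhost play wrapper and the hard-coded vars are mine, with the values copied from the log output, while the variable names are the ones visible above.)

- hosts: localhost
  gather_facts: false
  vars:
    # values copied from the log output above (real runs derive these from the role's pool facts)
    _storage_test_pool_pvs_lvm: ['/dev/sda']
    _storage_test_expected_pv_count: '1'
  tasks:
    - name: Recompute the PV list length the way the test does (sketch)
      set_fact:
        __pvs_lvm_len: "{{ _storage_test_pool_pvs_lvm | length }}"

    - name: Verify PV count and membership (sketch, not the role's literal task)
      assert:
        that:
          - __pvs_lvm_len | int == _storage_test_expected_pv_count | int
          - "'/dev/sda' in _storage_test_pool_pvs_lvm"
        msg: Pool 'foo' does not have the expected physical volumes

(Running this against localhost with ansible-playbook reproduces the same pass/fail behaviour seen in the "All assertions passed" results above, without touching the managed node.)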
Tuesday 13 May 2025 11:04:48 -0400 (0:00:00.270) 0:18:05.090 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Tuesday 13 May 2025 11:04:48 -0400 (0:00:00.331) 0:18:05.422 *********** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.80 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Tuesday 13 May 2025 11:04:50 -0400 (0:00:01.557) 0:18:06.980 *********** skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Tuesday 13 May 2025 11:04:50 -0400 (0:00:00.251) 0:18:07.231 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 13 May 2025 11:04:51 -0400 (0:00:00.639) 0:18:07.871 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 13 May 2025 11:04:51 -0400 (0:00:00.278) 0:18:08.149 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 13 May 2025 11:04:52 -0400 (0:00:00.315) 0:18:08.465 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 13 May 2025 11:04:52 -0400 (0:00:00.242) 0:18:08.708 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 13 May 2025 11:04:52 -0400 (0:00:00.240) 0:18:08.949 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 13 May 2025 11:04:53 
-0400 (0:00:00.783) 0:18:09.732 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 13 May 2025 11:04:53 -0400 (0:00:00.311) 0:18:10.043 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 13 May 2025 11:04:53 -0400 (0:00:00.310) 0:18:10.354 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 13 May 2025 11:04:54 -0400 (0:00:00.244) 0:18:10.598 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 13 May 2025 11:04:54 -0400 (0:00:00.208) 0:18:10.807 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 13 May 2025 11:04:54 -0400 (0:00:00.305) 0:18:11.112 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Tuesday 13 May 2025 11:04:54 -0400 (0:00:00.276) 0:18:11.388 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 13 May 2025 11:04:55 -0400 (0:00:00.530) 0:18:11.919 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 13 May 2025 11:04:56 -0400 (0:00:00.572) 0:18:12.492 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 13 May 2025 11:04:56 
-0400 (0:00:00.367) 0:18:12.859 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 13 May 2025 11:04:56 -0400 (0:00:00.296) 0:18:13.156 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 13 May 2025 11:04:57 -0400 (0:00:00.347) 0:18:13.504 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 13 May 2025 11:04:57 -0400 (0:00:00.261) 0:18:13.766 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 13 May 2025 11:04:57 -0400 (0:00:00.179) 0:18:13.946 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 13 May 2025 11:04:57 -0400 (0:00:00.272) 0:18:14.218 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Tuesday 13 May 2025 11:04:58 -0400 (0:00:00.317) 0:18:14.535 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 13 May 2025 11:04:58 -0400 (0:00:00.507) 0:18:15.042 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 13 May 2025 11:04:59 -0400 (0:00:00.436) 0:18:15.479 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 13 May 2025 11:04:59 -0400 (0:00:00.326) 0:18:15.805 *********** skipping: [managed-node8] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 13 May 2025 11:04:59 -0400 (0:00:00.291) 0:18:16.096 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 13 May 2025 11:04:59 -0400 (0:00:00.221) 0:18:16.317 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Tuesday 13 May 2025 11:05:00 -0400 (0:00:00.220) 0:18:16.538 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 13 May 2025 11:05:00 -0400 (0:00:00.526) 0:18:17.064 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 13 May 2025 11:05:00 -0400 (0:00:00.347) 0:18:17.412 *********** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 13 May 2025 11:05:01 -0400 (0:00:00.322) 0:18:17.734 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 13 May 2025 11:05:01 -0400 (0:00:00.453) 0:18:18.187 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 13 May 2025 11:05:02 -0400 (0:00:00.393) 0:18:18.581 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 13 May 
2025 11:05:02 -0400 (0:00:00.256) 0:18:18.838 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 13 May 2025 11:05:02 -0400 (0:00:00.200) 0:18:19.038 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 13 May 2025 11:05:02 -0400 (0:00:00.297) 0:18:19.336 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 13 May 2025 11:05:03 -0400 (0:00:00.334) 0:18:19.670 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 13 May 2025 11:05:03 -0400 (0:00:00.303) 0:18:19.974 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Tuesday 13 May 2025 11:05:03 -0400 (0:00:00.264) 0:18:20.238 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 13 May 2025 11:05:04 -0400 (0:00:00.580) 0:18:20.819 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 13 May 2025 11:05:04 -0400 (0:00:00.511) 0:18:21.330 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 13 May 2025 11:05:05 -0400 (0:00:00.307) 0:18:21.638 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 13 May 2025 11:05:05 -0400 (0:00:00.270) 
0:18:21.908 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 13 May 2025 11:05:05 -0400 (0:00:00.269) 0:18:22.178 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 13 May 2025 11:05:05 -0400 (0:00:00.254) 0:18:22.432 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 13 May 2025 11:05:06 -0400 (0:00:00.249) 0:18:22.681 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 13 May 2025 11:05:06 -0400 (0:00:00.719) 0:18:23.401 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Tuesday 13 May 2025 11:05:07 -0400 (0:00:00.289) 0:18:23.691 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 13 May 2025 11:05:07 -0400 (0:00:00.661) 0:18:24.352 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 13 May 2025 11:05:08 -0400 (0:00:00.413) 0:18:24.765 *********** skipping: [managed-node8] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 13 May 2025 11:05:08 -0400 (0:00:00.339) 0:18:25.105 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 13 May 2025 11:05:09 -0400 (0:00:00.344) 0:18:25.449 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 13 May 2025 11:05:09 -0400 (0:00:00.290) 0:18:25.740 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 13 May 2025 11:05:09 -0400 (0:00:00.273) 0:18:26.014 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 13 May 2025 11:05:09 -0400 (0:00:00.265) 0:18:26.279 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Tuesday 13 May 2025 11:05:10 -0400 (0:00:00.241) 0:18:26.521 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 13 May 2025 11:05:10 -0400 (0:00:00.149) 0:18:26.670 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 11:05:10 -0400 (0:00:00.365) 0:18:27.036 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 11:05:11 -0400 (0:00:00.463) 0:18:27.500 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 11:05:12 -0400 (0:00:01.393) 0:18:28.893 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 11:05:12 -0400 (0:00:00.315) 0:18:29.208 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 11:05:13 -0400 (0:00:00.312) 0:18:29.521 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 11:05:13 -0400 (0:00:00.347) 0:18:29.869 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 11:05:13 -0400 (0:00:00.283) 0:18:30.152 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 11:05:13 -0400 (0:00:00.254) 0:18:30.407 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 11:05:14 -0400 (0:00:00.210) 0:18:30.617 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 11:05:14 -0400 (0:00:00.195) 0:18:30.813 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 11:05:14 -0400 (0:00:00.205) 0:18:31.018 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 11:05:14 -0400 (0:00:00.211) 0:18:31.230 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 11:05:15 -0400 (0:00:00.283) 0:18:31.513 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 11:05:15 -0400 (0:00:00.202) 0:18:31.716 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 11:05:15 -0400 (0:00:00.423) 0:18:32.139 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 11:05:15 -0400 (0:00:00.276) 0:18:32.416 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 11:05:16 -0400 (0:00:00.247) 0:18:32.663 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 11:05:16 -0400 (0:00:00.184) 0:18:32.848 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 11:05:16 -0400 (0:00:00.284) 0:18:33.132 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 11:05:16 -0400 (0:00:00.281) 0:18:33.413 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 11:05:17 -0400 (0:00:00.367) 0:18:33.781 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 11:05:17 -0400 (0:00:00.359) 0:18:34.140 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148602.3711433, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148531.9390354, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 214950, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148531.9390354, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 11:05:19 -0400 (0:00:01.779) 0:18:35.919 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 11:05:19 -0400 (0:00:00.340) 0:18:36.260 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 11:05:20 -0400 (0:00:00.272) 0:18:36.532 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 11:05:20 -0400 (0:00:00.168) 0:18:36.701 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 11:05:20 -0400 (0:00:00.204) 0:18:36.905 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 11:05:20 -0400 (0:00:00.163) 0:18:37.068 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 11:05:20 -0400 (0:00:00.226) 0:18:37.295 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148658.4682295, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148532.0910356, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217111, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148532.0910356, "nlink": 1, "path": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 11:05:22 -0400 (0:00:01.307) 0:18:38.602 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 11:05:26 -0400 (0:00:04.310) 0:18:42.913 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009575", "end": "2025-05-13 11:05:27.841774", "rc": 0, "start": "2025-05-13 11:05:27.832199" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 6a e2 dd ad 0e 67 f2 f3 58 c3 8d b1 ec e5 6a 6f 59 01 33 67 MK salt: 2a b1 d1 b8 2c 7c 37 bb a6 f6 f6 5f 67 f5 9b 28 1f e1 ff e7 80 bb c8 9d b5 35 13 6f 8f 4e 3a 81 MK iterations: 120470 UUID: f464edab-7000-480c-9b25-609fde3e3013 Key Slot 0: ENABLED Iterations: 1923992 Salt: d7 31 cd e5 69 16 12 29 54 c8 d6 db f3 71 e3 94 74 c5 88 0b ae 8b 5e 0f 49 b3 29 25 bc ea d8 6c Key 
material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 11:05:28 -0400 (0:00:01.968) 0:18:44.882 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 11:05:28 -0400 (0:00:00.382) 0:18:45.264 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 11:05:29 -0400 (0:00:00.339) 0:18:45.604 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 11:05:29 -0400 (0:00:00.303) 0:18:45.907 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 11:05:29 -0400 (0:00:00.364) 0:18:46.271 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 11:05:30 -0400 (0:00:00.407) 0:18:46.679 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 11:05:30 -0400 (0:00:00.333) 0:18:47.012 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 11:05:30 -0400 (0:00:00.290) 0:18:47.302 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f464edab-7000-480c-9b25-609fde3e3013 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 11:05:31 -0400 (0:00:00.377) 0:18:47.680 *********** ok: 
[managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 11:05:31 -0400 (0:00:00.266) 0:18:47.947 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 11:05:31 -0400 (0:00:00.347) 0:18:48.295 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 11:05:32 -0400 (0:00:00.377) 0:18:48.672 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 11:05:32 -0400 (0:00:00.366) 0:18:49.038 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 11:05:32 -0400 (0:00:00.334) 0:18:49.373 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 11:05:33 -0400 (0:00:00.266) 0:18:49.640 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 11:05:33 -0400 (0:00:00.193) 0:18:49.833 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 11:05:33 -0400 (0:00:00.199) 0:18:50.033 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 11:05:33 -0400 (0:00:00.287) 0:18:50.320 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
**************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 11:05:34 -0400 (0:00:00.209) 0:18:50.530 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 11:05:34 -0400 (0:00:00.212) 0:18:50.742 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 11:05:34 -0400 (0:00:00.261) 0:18:51.004 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 11:05:34 -0400 (0:00:00.257) 0:18:51.261 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 11:05:35 -0400 (0:00:00.211) 0:18:51.473 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 11:05:35 -0400 (0:00:00.144) 0:18:51.617 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 11:05:36 -0400 (0:00:01.483) 0:18:53.100 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 11:05:38 -0400 (0:00:01.691) 0:18:54.791 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 11:05:38 -0400 (0:00:00.279) 0:18:55.071 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 
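(Aside on the size verification that follows, which concludes at "Assert expected size is actual size" further down: once both sizes have been parsed to bytes, the check is an integer comparison between the actual size and the expected-size fact. A minimal sketch, assuming the byte values shown in the log — the localhost play and hard-coded vars are illustrative, not the role test's own code.)

- hosts: localhost
  gather_facts: false
  vars:
    # byte values copied from the log output above
    storage_test_actual_size:
      bytes: 4294967296
    storage_test_expected_size: "4294967296"
  tasks:
    - name: Assert expected size is actual size (sketch of the final check)
      assert:
        that:
          - storage_test_expected_size | int == storage_test_actual_size.bytes | int
        msg: >-
          expected {{ storage_test_expected_size }} bytes,
          got {{ storage_test_actual_size.bytes }} bytes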
Tuesday 13 May 2025 11:05:38 -0400 (0:00:00.213) 0:18:55.285 *********** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 11:05:40 -0400 (0:00:01.703) 0:18:56.989 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 11:05:40 -0400 (0:00:00.246) 0:18:57.235 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 11:05:41 -0400 (0:00:00.315) 0:18:57.551 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 11:05:41 -0400 (0:00:00.272) 0:18:57.823 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 11:05:41 -0400 (0:00:00.270) 0:18:58.093 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 11:05:42 -0400 (0:00:00.413) 0:18:58.507 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 11:05:42 -0400 (0:00:00.355) 0:18:58.863 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 11:05:42 -0400 (0:00:00.377) 0:18:59.240 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 11:05:43 -0400 (0:00:00.297) 0:18:59.537 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 11:05:43 -0400 (0:00:00.301) 0:18:59.838 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 11:05:43 -0400 (0:00:00.225) 0:19:00.063 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 11:05:43 -0400 (0:00:00.301) 0:19:00.365 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 11:05:44 -0400 (0:00:00.198) 0:19:00.564 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 11:05:44 -0400 (0:00:00.207) 0:19:00.771 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 11:05:44 -0400 (0:00:00.194) 0:19:00.966 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 11:05:44 -0400 (0:00:00.177) 0:19:01.144 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 11:05:44 -0400 (0:00:00.208) 0:19:01.352 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 11:05:45 -0400 (0:00:00.151) 0:19:01.504 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 11:05:45 -0400 (0:00:00.222) 0:19:01.726 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 11:05:45 -0400 (0:00:00.267) 0:19:01.994 *********** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 11:05:45 -0400 (0:00:00.292) 0:19:02.286 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 11:05:46 -0400 (0:00:00.197) 0:19:02.483 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 11:05:46 -0400 (0:00:00.241) 0:19:02.725 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024772", "end": "2025-05-13 11:05:47.498720", "rc": 0, "start": "2025-05-13 11:05:47.473948" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 11:05:47 -0400 (0:00:01.578) 0:19:04.303 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 11:05:48 -0400 (0:00:00.314) 0:19:04.618 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 11:05:48 -0400 (0:00:00.322) 0:19:04.941 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 11:05:48 -0400 (0:00:00.270) 0:19:05.211 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 11:05:48 -0400 (0:00:00.174) 0:19:05.385 *********** skipping: [managed-node8] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 11:05:49 -0400 (0:00:00.197) 0:19:05.583 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 11:05:50 -0400 (0:00:00.862) 0:19:06.445 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 11:05:50 -0400 (0:00:00.191) 0:19:06.637 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 11:05:50 -0400 (0:00:00.256) 0:19:06.894 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 13 May 2025 11:05:50 -0400 (0:00:00.201) 0:19:07.095 *********** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:423 Tuesday 13 May 2025 11:05:52 -0400 (0:00:01.477) 0:19:08.572 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 11:05:52 -0400 (0:00:00.347) 0:19:08.920 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 11:05:52 -0400 (0:00:00.299) 0:19:09.219 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:05:53 -0400 (0:00:00.341) 0:19:09.560 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure 
ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:05:53 -0400 (0:00:00.588) 0:19:10.149 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:05:54 -0400 (0:00:00.366) 0:19:10.515 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:05:54 -0400 (0:00:00.629) 0:19:11.145 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:05:54 -0400 (0:00:00.255) 0:19:11.401 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:05:55 -0400 (0:00:00.153) 0:19:11.555 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:05:55 -0400 (0:00:00.122) 0:19:11.677 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:05:55 -0400 (0:00:00.121) 0:19:11.798 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:05:55 -0400 (0:00:00.559) 0:19:12.358 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:06:00 -0400 (0:00:04.468) 0:19:16.826 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:06:00 -0400 (0:00:00.281) 0:19:17.108 *********** ok: [managed-node8] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:06:00 -0400 (0:00:00.174) 0:19:17.282 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:06:06 -0400 (0:00:05.524) 0:19:22.807 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:06:06 -0400 (0:00:00.485) 0:19:23.292 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 11:06:07 -0400 (0:00:00.160) 0:19:23.453 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:06:07 -0400 (0:00:00.223) 0:19:23.676 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:06:07 -0400 (0:00:00.138) 0:19:23.814 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:06:12 -0400 (0:00:04.918) 0:19:28.733 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service": { "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service": { "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": 
"systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:06:15 -0400 (0:00:02.742) 0:19:31.476 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:06:15 -0400 (0:00:00.333) 0:19:31.809 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2df464edab\x2d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f464edab-7000-480c-9b25-609fde3e3013", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f464edab-7000-480c-9b25-609fde3e3013 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f464edab-7000-480c-9b25-609fde3e3013 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": 
"14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 11:04:12 EDT", "StateChangeTimestampMonotonic": "2810907941", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": 
"systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:06:19 -0400 (0:00:03.754) 0:19:35.564 *********** fatal: [managed-node8]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-f464edab-7000-480c-9b25-609fde3e3013' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 11:06:24 -0400 (0:00:05.199) 0:19:40.763 *********** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-f464edab-7000-480c-9b25-609fde3e3013' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 
'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:06:24 -0400 (0:00:00.253) 0:19:41.016 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2df464edab\x2d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 11:04:12 EDT", "StateChangeTimestampMonotonic": "2810907941", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", 
"TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 11:06:28 -0400 (0:00:03.425) 0:19:44.442 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 11:06:28 -0400 (0:00:00.228) 0:19:44.671 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 11:06:28 -0400 (0:00:00.341) 0:19:45.012 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 13 May 2025 11:06:28 -0400 (0:00:00.311) 0:19:45.324 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148751.911373, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747148751.911373, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1747148751.911373, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2368352559", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 13 May 2025 11:06:30 -0400 (0:00:01.779) 0:19:47.103 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:446 Tuesday 13 May 2025 11:06:30 -0400 (0:00:00.212) 0:19:47.316 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:06:31 -0400 (0:00:00.693) 0:19:48.010 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:06:31 -0400 (0:00:00.305) 0:19:48.316 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific 
variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:06:32 -0400 (0:00:00.294) 0:19:48.610 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:06:32 -0400 (0:00:00.628) 0:19:49.239 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:06:33 -0400 (0:00:00.330) 0:19:49.570 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:06:33 -0400 (0:00:00.251) 0:19:49.821 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:06:33 -0400 (0:00:00.224) 0:19:50.046 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:06:33 -0400 (0:00:00.244) 0:19:50.290 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK 
[fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:06:34 -0400 (0:00:00.749) 0:19:51.040 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:06:39 -0400 (0:00:04.504) 0:19:55.545 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:06:39 -0400 (0:00:00.305) 0:19:55.850 *********** ok: [managed-node8] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:06:39 -0400 (0:00:00.217) 0:19:56.068 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:06:45 -0400 (0:00:05.405) 0:20:01.473 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:06:45 -0400 (0:00:00.419) 0:20:01.893 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 11:06:45 -0400 (0:00:00.174) 0:20:02.068 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:06:45 -0400 (0:00:00.362) 0:20:02.430 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:06:46 -0400 (0:00:00.201) 0:20:02.632 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
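Note: before handing the pools to blivet, the role gathers service facts (next task) to locate any generated systemd-cryptsetup@*.service units, records them in the storage_cryptsetup_services fact, masks them while storage is being reshaped, and unmasks them again afterwards (see the "Unmask the systemd cryptsetup services" output earlier in this log). A rough sketch of that mask/unmask pattern, assuming plain systemd module calls rather than the role's exact task files:

  - name: Mask generated cryptsetup units while storage is being changed (sketch)
    systemd:
      name: "{{ item }}"
      masked: true
    loop: "{{ storage_cryptsetup_services }}"

  - name: Unmask them once the storage changes are done (sketch)
    systemd:
      name: "{{ item }}"
      masked: false
    loop: "{{ storage_cryptsetup_services }}"

Masking prevents the generated cryptsetup units from being started or stopped by udev/device events in the middle of the role's destructive operations; the unmask task runs even on failure, which is why it appears right after the "Failed message" task above.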
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:06:51 -0400 (0:00:04.871) 0:20:07.503 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { 
"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": 
"nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service": { "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service": { "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:06:54 -0400 (0:00:03.645) 0:20:11.148 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:06:55 -0400 (0:00:00.788) 0:20:11.937 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2df464edab\x2d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f464edab-7000-480c-9b25-609fde3e3013", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f464edab-7000-480c-9b25-609fde3e3013 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f464edab-7000-480c-9b25-609fde3e3013 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 11:04:12 EDT", "StateChangeTimestampMonotonic": "2810907941", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", 
"CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": 
"yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:06:59 -0400 (0:00:03.668) 0:20:15.605 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f464edab-7000-480c-9b25-609fde3e3013", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 11:07:05 -0400 (0:00:06.193) 0:20:21.798 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 11:07:05 -0400 (0:00:00.265) 0:20:22.064 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148542.229051, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a9b2e05858d5ce987640d10181a613dd2d9428af", "ctime": 1747148542.226051, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747148542.226051, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 11:07:07 -0400 (0:00:01.917) 0:20:23.981 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:07:09 -0400 (0:00:01.586) 0:20:25.568 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2df464edab\x2d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": 
"none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 11:04:12 EDT", "StateChangeTimestampMonotonic": "2810907941", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", 
"TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 11:07:12 -0400 (0:00:03.498) 0:20:29.067 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f464edab-7000-480c-9b25-609fde3e3013", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 11:07:12 -0400 (0:00:00.264) 0:20:29.332 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 11:07:13 -0400 (0:00:00.315) 0:20:29.647 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 11:07:13 -0400 (0:00:00.322) 0:20:29.970 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f464edab-7000-480c-9b25-609fde3e3013" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 11:07:15 -0400 (0:00:01.741) 0:20:31.712 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 11:07:17 -0400 (0:00:01.802) 0:20:33.514 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": 
true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 11:07:18 -0400 (0:00:01.763) 0:20:35.277 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 11:07:19 -0400 (0:00:00.369) 0:20:35.647 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 11:07:20 -0400 (0:00:01.475) 0:20:37.122 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148556.489073, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1fc12588c0982e5b295b191e3ee0297832b97702", "ctime": 1747148548.8770614, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 220201157, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1747148548.8770614, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3620939784", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 11:07:21 -0400 (0:00:01.261) 0:20:38.383 *********** changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-f464edab-7000-480c-9b25-609fde3e3013', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f464edab-7000-480c-9b25-609fde3e3013", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 
Tuesday 13 May 2025 11:07:23 -0400 (0:00:01.476) 0:20:39.860 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:462 Tuesday 13 May 2025 11:07:25 -0400 (0:00:01.698) 0:20:41.558 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 11:07:25 -0400 (0:00:00.631) 0:20:42.190 *********** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 11:07:25 -0400 (0:00:00.217) 0:20:42.407 *********** skipping: [managed-node8] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 11:07:26 -0400 (0:00:00.216) 0:20:42.624 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "b5170eb2-562f-49c2-8d5d-ecee8c034708" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 11:07:27 -0400 (0:00:01.391) 0:20:44.016 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002739", "end": "2025-05-13 11:07:28.664714", "rc": 0, "start": "2025-05-13 11:07:28.661975" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 11:07:28 -0400 (0:00:01.418) 0:20:45.435 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002880", "end": "2025-05-13 11:07:30.236681", "failed_when_result": false, "rc": 0, "start": "2025-05-13 11:07:30.233801" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 11:07:30 -0400 (0:00:01.541) 0:20:46.976 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 13 May 2025 11:07:30 -0400 (0:00:00.388) 0:20:47.364 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 13 May 2025 11:07:31 -0400 (0:00:00.331) 0:20:47.696 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023213", "end": "2025-05-13 11:07:32.458616", "rc": 0, "start": "2025-05-13 11:07:32.435403" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 13 May 2025 11:07:32 -0400 (0:00:01.518) 0:20:49.214 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 13 May 2025 11:07:33 -0400 (0:00:00.321) 0:20:49.535 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 13 May 2025 11:07:33 -0400 (0:00:00.459) 0:20:49.995 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 13 May 2025 11:07:33 -0400 (0:00:00.399) 0:20:50.394 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 13 May 2025 11:07:35 -0400 (0:00:01.557) 0:20:51.951 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 13 May 2025 11:07:35 -0400 (0:00:00.281) 0:20:52.233 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 13 May 2025 11:07:36 -0400 (0:00:00.288) 0:20:52.522 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 13 May 2025 11:07:36 -0400 (0:00:00.251) 0:20:52.773 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 13 May 2025 11:07:36 -0400 (0:00:00.277) 0:20:53.051 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 13 May 2025 11:07:36 -0400 (0:00:00.220) 0:20:53.271 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Tuesday 13 May 2025 11:07:37 -0400 (0:00:00.269) 0:20:53.541 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", 
"changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Tuesday 13 May 2025 11:07:37 -0400 (0:00:00.329) 0:20:53.871 *********** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.80 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Tuesday 13 May 2025 11:07:38 -0400 (0:00:01.355) 0:20:55.227 *********** skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Tuesday 13 May 2025 11:07:39 -0400 (0:00:00.212) 0:20:55.440 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 13 May 2025 11:07:39 -0400 (0:00:00.372) 0:20:55.812 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 13 May 2025 11:07:39 -0400 (0:00:00.197) 0:20:56.009 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 13 May 2025 11:07:39 -0400 (0:00:00.267) 0:20:56.277 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 13 May 2025 11:07:40 -0400 (0:00:00.271) 0:20:56.549 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 13 May 2025 11:07:40 -0400 (0:00:00.205) 0:20:56.754 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 13 May 2025 11:07:40 -0400 (0:00:00.290) 0:20:57.045 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 13 May 2025 11:07:40 -0400 (0:00:00.306) 0:20:57.351 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 13 May 2025 11:07:41 -0400 (0:00:00.273) 0:20:57.625 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 13 May 2025 11:07:41 -0400 (0:00:00.257) 0:20:57.883 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 13 May 2025 11:07:41 -0400 (0:00:00.293) 0:20:58.176 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 13 May 2025 11:07:42 -0400 (0:00:00.314) 0:20:58.491 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Tuesday 13 May 2025 11:07:42 -0400 (0:00:00.229) 0:20:58.720 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 13 May 2025 11:07:42 -0400 (0:00:00.546) 0:20:59.267 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 13 May 2025 11:07:43 -0400 (0:00:00.489) 0:20:59.756 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 13 May 2025 11:07:43 -0400 (0:00:00.230) 0:20:59.986 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 13 May 2025 11:07:43 -0400 (0:00:00.232) 0:21:00.219 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 13 May 2025 11:07:44 -0400 (0:00:00.240) 0:21:00.460 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 13 May 2025 11:07:44 -0400 (0:00:00.895) 0:21:01.355 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 13 May 2025 11:07:45 -0400 (0:00:00.246) 0:21:01.602 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 13 May 2025 11:07:45 -0400 (0:00:00.286) 0:21:01.888 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Tuesday 13 May 2025 11:07:45 -0400 (0:00:00.261) 0:21:02.149 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 13 May 2025 11:07:46 -0400 (0:00:00.527) 0:21:02.677 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 13 May 2025 11:07:46 -0400 (0:00:00.443) 0:21:03.120 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 13 May 2025 11:07:46 -0400 (0:00:00.264) 0:21:03.385 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 13 May 2025 11:07:47 -0400 (0:00:00.391) 0:21:03.776 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 13 May 2025 11:07:47 -0400 (0:00:00.139) 0:21:03.916 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Tuesday 13 May 2025 11:07:47 -0400 (0:00:00.195) 0:21:04.112 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 13 May 2025 11:07:48 -0400 (0:00:00.339) 0:21:04.451 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 13 May 2025 11:07:48 -0400 (0:00:00.160) 0:21:04.612 *********** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 13 May 2025 11:07:48 -0400 (0:00:00.320) 0:21:04.932 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 13 May 2025 11:07:48 -0400 (0:00:00.473) 0:21:05.406 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 13 May 2025 11:07:49 -0400 (0:00:00.410) 0:21:05.817 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 13 May 2025 11:07:49 -0400 (0:00:00.275) 0:21:06.093 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 13 May 2025 11:07:49 -0400 (0:00:00.289) 0:21:06.383 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 13 May 2025 11:07:50 -0400 (0:00:00.237) 0:21:06.621 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 13 May 2025 11:07:50 -0400 (0:00:00.275) 0:21:06.896 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 13 May 2025 11:07:50 -0400 (0:00:00.143) 0:21:07.039 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Tuesday 13 May 2025 11:07:50 -0400 (0:00:00.129) 0:21:07.169 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 13 May 2025 11:07:51 -0400 (0:00:00.562) 0:21:07.732 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 13 May 2025 11:07:51 -0400 (0:00:00.396) 0:21:08.128 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 13 May 2025 11:07:52 -0400 (0:00:00.341) 0:21:08.470 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 13 May 2025 11:07:52 -0400 (0:00:00.268) 0:21:08.738 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about 
VDO compression] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 13 May 2025 11:07:52 -0400 (0:00:00.270) 0:21:09.009 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 13 May 2025 11:07:52 -0400 (0:00:00.214) 0:21:09.223 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 13 May 2025 11:07:53 -0400 (0:00:00.265) 0:21:09.488 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 13 May 2025 11:07:53 -0400 (0:00:00.188) 0:21:09.677 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Tuesday 13 May 2025 11:07:53 -0400 (0:00:00.208) 0:21:09.886 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 13 May 2025 11:07:54 -0400 (0:00:00.778) 0:21:10.665 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 13 May 2025 11:07:54 -0400 (0:00:00.258) 0:21:10.923 *********** skipping: [managed-node8] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 13 May 2025 11:07:54 -0400 (0:00:00.303) 0:21:11.227 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 13 May 2025 11:07:55 -0400 (0:00:00.215) 0:21:11.443 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 13 May 2025 11:07:55 -0400 (0:00:00.184) 0:21:11.627 
*********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 13 May 2025 11:07:55 -0400 (0:00:00.287) 0:21:11.915 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 13 May 2025 11:07:55 -0400 (0:00:00.274) 0:21:12.189 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Tuesday 13 May 2025 11:07:55 -0400 (0:00:00.197) 0:21:12.386 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 13 May 2025 11:07:56 -0400 (0:00:00.177) 0:21:12.564 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 11:07:56 -0400 (0:00:00.420) 0:21:12.984 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 11:07:56 -0400 (0:00:00.310) 0:21:13.294 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 11:07:58 -0400 (0:00:01.986) 0:21:15.281 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 11:07:59 -0400 (0:00:00.290) 0:21:15.571 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 11:07:59 -0400 (0:00:00.256) 0:21:15.828 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 11:07:59 -0400 (0:00:00.142) 0:21:15.971 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 11:07:59 -0400 (0:00:00.318) 0:21:16.290 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 11:08:00 -0400 (0:00:00.267) 0:21:16.557 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 11:08:00 -0400 (0:00:00.214) 0:21:16.771 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 11:08:00 -0400 (0:00:00.306) 0:21:17.077 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 11:08:00 -0400 (0:00:00.242) 0:21:17.319 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 11:08:01 -0400 (0:00:00.245) 0:21:17.565 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 11:08:01 -0400 (0:00:00.302) 0:21:17.867 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 11:08:01 -0400 (0:00:00.303) 0:21:18.171 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 11:08:02 -0400 (0:00:00.612) 0:21:18.784 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 11:08:02 -0400 (0:00:00.340) 0:21:19.124 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 11:08:02 -0400 (0:00:00.309) 0:21:19.433 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 11:08:03 -0400 (0:00:00.343) 0:21:19.777 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 11:08:03 -0400 (0:00:00.215) 0:21:19.992 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, 
"storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 11:08:03 -0400 (0:00:00.250) 0:21:20.243 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 11:08:04 -0400 (0:00:00.318) 0:21:20.561 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 11:08:04 -0400 (0:00:00.279) 0:21:20.841 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148825.044485, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148825.044485, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 246576, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148825.044485, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 11:08:05 -0400 (0:00:01.511) 0:21:22.352 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 11:08:06 -0400 (0:00:00.216) 0:21:22.569 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 11:08:06 -0400 (0:00:00.230) 0:21:22.800 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 11:08:06 -0400 (0:00:00.249) 0:21:23.049 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 11:08:06 -0400 (0:00:00.338) 0:21:23.387 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 11:08:07 -0400 (0:00:00.267) 0:21:23.654 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 11:08:07 -0400 (0:00:00.214) 0:21:23.869 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 11:08:07 -0400 (0:00:00.200) 0:21:24.070 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 11:08:12 -0400 (0:00:04.760) 0:21:28.830 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 11:08:12 -0400 (0:00:00.139) 0:21:28.969 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 11:08:12 -0400 (0:00:00.308) 0:21:29.278 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 11:08:13 -0400 (0:00:00.354) 0:21:29.633 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 11:08:13 -0400 (0:00:00.297) 0:21:29.930 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 11:08:13 -0400 (0:00:00.264) 0:21:30.195 *********** skipping: 
[managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 11:08:14 -0400 (0:00:00.345) 0:21:30.541 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 11:08:14 -0400 (0:00:00.280) 0:21:30.822 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 11:08:14 -0400 (0:00:00.259) 0:21:31.081 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 11:08:15 -0400 (0:00:00.397) 0:21:31.479 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 11:08:15 -0400 (0:00:00.292) 0:21:31.771 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 11:08:15 -0400 (0:00:00.269) 0:21:32.040 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 11:08:15 -0400 (0:00:00.325) 0:21:32.366 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 11:08:16 -0400 (0:00:00.303) 0:21:32.669 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 11:08:16 -0400 
(0:00:00.271) 0:21:32.941 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 11:08:16 -0400 (0:00:00.320) 0:21:33.262 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 11:08:17 -0400 (0:00:00.247) 0:21:33.509 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 11:08:17 -0400 (0:00:00.222) 0:21:33.732 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 11:08:17 -0400 (0:00:00.220) 0:21:33.952 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 11:08:17 -0400 (0:00:00.246) 0:21:34.199 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 11:08:17 -0400 (0:00:00.175) 0:21:34.375 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 11:08:18 -0400 (0:00:00.283) 0:21:34.658 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 11:08:18 -0400 (0:00:00.198) 0:21:34.857 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 11:08:18 -0400 (0:00:00.346) 0:21:35.203 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] 
************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 11:08:19 -0400 (0:00:00.316) 0:21:35.520 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 11:08:20 -0400 (0:00:01.440) 0:21:36.960 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 11:08:22 -0400 (0:00:01.633) 0:21:38.594 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 11:08:22 -0400 (0:00:00.339) 0:21:38.934 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 11:08:22 -0400 (0:00:00.229) 0:21:39.163 *********** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 11:08:24 -0400 (0:00:01.554) 0:21:40.717 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 11:08:24 -0400 (0:00:00.251) 0:21:40.968 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 11:08:24 -0400 (0:00:00.259) 0:21:41.228 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 11:08:25 -0400 (0:00:00.229) 0:21:41.457 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 11:08:25 -0400 (0:00:00.263) 0:21:41.721 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default 
minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 11:08:25 -0400 (0:00:00.202) 0:21:41.923 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 11:08:25 -0400 (0:00:00.210) 0:21:42.134 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 11:08:25 -0400 (0:00:00.271) 0:21:42.405 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 11:08:26 -0400 (0:00:00.236) 0:21:42.641 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 11:08:26 -0400 (0:00:00.285) 0:21:42.927 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 11:08:26 -0400 (0:00:00.275) 0:21:43.202 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 11:08:27 -0400 (0:00:00.247) 0:21:43.449 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 11:08:27 -0400 (0:00:00.297) 0:21:43.746 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 11:08:27 -0400 (0:00:00.329) 0:21:44.075 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 11:08:27 -0400 (0:00:00.275) 0:21:44.351 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the 
expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 11:08:28 -0400 (0:00:00.276) 0:21:44.628 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 11:08:28 -0400 (0:00:00.307) 0:21:44.935 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 11:08:28 -0400 (0:00:00.249) 0:21:45.184 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 11:08:29 -0400 (0:00:00.311) 0:21:45.495 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 11:08:29 -0400 (0:00:00.288) 0:21:45.783 *********** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 11:08:29 -0400 (0:00:00.227) 0:21:46.011 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 11:08:29 -0400 (0:00:00.253) 0:21:46.264 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 11:08:30 -0400 (0:00:00.353) 0:21:46.617 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025569", "end": "2025-05-13 11:08:31.328720", "rc": 0, "start": "2025-05-13 11:08:31.303151" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 
11:08:31 -0400 (0:00:01.440) 0:21:48.058 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 11:08:31 -0400 (0:00:00.310) 0:21:48.368 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 11:08:32 -0400 (0:00:00.271) 0:21:48.640 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 11:08:32 -0400 (0:00:00.213) 0:21:48.854 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 11:08:32 -0400 (0:00:00.249) 0:21:49.103 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 11:08:32 -0400 (0:00:00.220) 0:21:49.323 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 11:08:33 -0400 (0:00:00.153) 0:21:49.476 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 11:08:33 -0400 (0:00:00.151) 0:21:49.628 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 11:08:33 -0400 (0:00:00.258) 0:21:49.886 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Tuesday 13 May 2025 11:08:33 -0400 (0:00:00.155) 0:21:50.042 *********** changed: [managed-node8] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, 
"state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:468 Tuesday 13 May 2025 11:08:34 -0400 (0:00:01.368) 0:21:51.411 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node8 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Tuesday 13 May 2025 11:08:35 -0400 (0:00:00.377) 0:21:51.789 *********** ok: [managed-node8] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Tuesday 13 May 2025 11:08:35 -0400 (0:00:00.215) 0:21:52.004 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:08:35 -0400 (0:00:00.231) 0:21:52.235 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:08:36 -0400 (0:00:00.260) 0:21:52.495 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:08:36 -0400 (0:00:00.213) 0:21:52.708 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:08:36 -0400 (0:00:00.647) 0:21:53.356 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:08:37 -0400 (0:00:00.274) 0:21:53.630 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:08:37 -0400 (0:00:00.248) 0:21:53.879 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:08:37 -0400 (0:00:00.182) 0:21:54.061 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:08:37 -0400 (0:00:00.180) 0:21:54.242 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:08:38 -0400 (0:00:00.643) 0:21:54.886 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:08:43 -0400 (0:00:04.948) 0:21:59.834 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:08:43 -0400 (0:00:00.319) 0:22:00.154 *********** ok: [managed-node8] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:08:43 -0400 (0:00:00.260) 0:22:00.414 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], 
"pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:08:49 -0400 (0:00:05.833) 0:22:06.247 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:08:50 -0400 (0:00:00.380) 0:22:06.628 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 11:08:50 -0400 (0:00:00.105) 0:22:06.734 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:08:50 -0400 (0:00:00.219) 0:22:06.953 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:08:50 -0400 (0:00:00.175) 0:22:07.129 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:08:55 -0400 (0:00:04.816) 0:22:11.945 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": 
"sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service": { "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service": { "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:08:58 -0400 (0:00:02.658) 0:22:14.604 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [ 
"systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:08:58 -0400 (0:00:00.354) 0:22:14.959 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2df464edab\x2d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f464edab-7000-480c-9b25-609fde3e3013", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f464edab-7000-480c-9b25-609fde3e3013 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f464edab-7000-480c-9b25-609fde3e3013 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": 
"/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-05-13 11:04:12 EDT", "StateChangeTimestampMonotonic": "2810907941", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", 
"IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", 
"StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:09:02 -0400 (0:00:03.603) 0:22:18.562 *********** fatal: [managed-node8]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Tuesday 13 May 2025 11:09:07 -0400 (0:00:05.809) 0:22:24.371 *********** fatal: [managed-node8]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 
'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:09:08 -0400 (0:00:00.310) 0:22:24.682 *********** changed: [managed-node8] => (item=systemd-cryptsetup@luks\x2df464edab\x2d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": 
"", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df464edab\\x2d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", 
"StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node8] => (item=systemd-cryptsetup@luk...d7000\x2d480c\x2d9b25\x2d609fde3e3013.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "name": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7000\\x2d480c\\x2d9b25\\x2d609fde3e3013.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", 
"SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Tuesday 13 May 2025 11:09:11 -0400 (0:00:03.387) 0:22:28.070 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Tuesday 13 May 2025 11:09:11 -0400 (0:00:00.200) 0:22:28.271 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Tuesday 13 May 2025 11:09:12 -0400 (0:00:00.391) 0:22:28.663 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Tuesday 13 May 2025 11:09:12 -0400 (0:00:00.387) 0:22:29.050 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148914.6446226, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747148914.6446226, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1747148914.6446226, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1848517172", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Tuesday 13 May 2025 11:09:13 -0400 (0:00:01.237) 0:22:30.288 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:491 Tuesday 13 May 2025 11:09:14 -0400 (0:00:00.209) 0:22:30.498 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:09:14 
-0400 (0:00:00.638) 0:22:31.136 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:09:14 -0400 (0:00:00.221) 0:22:31.358 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:09:15 -0400 (0:00:00.810) 0:22:32.168 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:09:16 -0400 (0:00:00.512) 0:22:32.680 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:09:16 -0400 (0:00:00.285) 0:22:32.965 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:09:16 -0400 (0:00:00.228) 0:22:33.193 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:09:17 -0400 (0:00:00.319) 0:22:33.513 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:09:17 -0400 (0:00:00.222) 0:22:33.736 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:09:17 -0400 (0:00:00.589) 0:22:34.326 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:09:22 -0400 (0:00:04.746) 0:22:39.073 *********** ok: [managed-node8] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:09:22 -0400 (0:00:00.353) 0:22:39.426 *********** ok: [managed-node8] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:09:23 -0400 (0:00:00.332) 0:22:39.758 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:09:29 -0400 (0:00:05.870) 0:22:45.628 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:09:29 -0400 (0:00:00.453) 0:22:46.082 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 11:09:29 -0400 (0:00:00.166) 0:22:46.249 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
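The storage_pools value printed above is what drives this part of the run: the existing 4g LVM volume test1 in pool foo (on sda) is re-specified with encryption: true, so the role will wrap it in LUKS and remount it at /opt/test1. A minimal sketch of a play passing the same specification to the role — illustrative only, not the test's actual play in tests_luks.yml:

- hosts: managed-node8
  tasks:
    - name: Add encryption to the volume
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_password: yabbadabbadoo   # throwaway test value shown in the log
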
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:09:30 -0400 (0:00:00.254) 0:22:46.503 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:09:30 -0400 (0:00:00.193) 0:22:46.697 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:09:34 -0400 (0:00:04.722) 0:22:51.419 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:09:37 -0400 (0:00:02.931) 0:22:54.351 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:09:38 -0400 (0:00:00.242) 0:22:54.594 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:09:38 -0400 (0:00:00.202) 0:22:54.796 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 11:09:52 -0400 (0:00:13.902) 0:23:08.699 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 11:09:52 -0400 (0:00:00.290) 0:23:08.989 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148838.526506, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1747148838.523506, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747148838.523506, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 11:09:54 -0400 (0:00:01.733) 0:23:10.723 *********** ok: [managed-node8] => { "backup": "", 
"changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:09:55 -0400 (0:00:01.587) 0:23:12.311 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 11:09:56 -0400 (0:00:00.172) 0:23:12.483 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the 
list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 11:09:56 -0400 (0:00:00.291) 0:23:12.774 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 11:09:56 -0400 (0:00:00.344) 0:23:13.119 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 11:09:57 -0400 (0:00:00.315) 0:23:13.435 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 11:09:58 -0400 (0:00:01.513) 0:23:14.949 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task 
path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 11:10:00 -0400 (0:00:01.877) 0:23:16.826 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 11:10:01 -0400 (0:00:01.406) 0:23:18.233 *********** skipping: [managed-node8] => (item={'src': '/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 11:10:02 -0400 (0:00:00.287) 0:23:18.520 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 11:10:03 -0400 (0:00:01.587) 0:23:20.108 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148850.2345238, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1747148843.127513, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 520094071, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1747148843.1255128, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1957659247", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 11:10:05 -0400 (0:00:01.729) 0:23:21.837 *********** 
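The two mount operations above are the fstab hand-off: the old /dev/mapper/foo-test1 entry is removed and the new LUKS-backed device is mounted at /opt/test1 in its place. The role appears to loop the mount module over the blivet mounts list shown earlier; standalone tasks producing the same two changes might look roughly like this (illustrative only):

- name: Remove the obsolete unencrypted entry
  mount:                       # builtin module in Ansible 2.9
    src: /dev/mapper/foo-test1
    path: /opt/test1
    fstype: xfs
    state: absent
- name: Mount the new LUKS-backed filesystem
  mount:
    src: /dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c
    path: /opt/test1
    fstype: xfs
    opts: defaults
    state: mounted
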
changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-84abe487-95a9-4263-9b50-0c8586c8c52c', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 11:10:07 -0400 (0:00:01.688) 0:23:23.526 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:507 Tuesday 13 May 2025 11:10:09 -0400 (0:00:01.931) 0:23:25.457 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 11:10:09 -0400 (0:00:00.595) 0:23:26.052 *********** ok: [managed-node8] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 11:10:09 -0400 (0:00:00.273) 0:23:26.326 *********** skipping: [managed-node8] => {} TASK [Collect info about the volumes.] 
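The crypttab entry added above maps the luks-&lt;UUID&gt; name to its backing LV with no keyfile (password "-"), which is what later appears as the single line in /etc/crypttab. A standalone task producing the same line could use the stock crypttab module — illustrative only; whether the role itself uses this module or edits the file another way is not shown in the log:

- name: Add the LUKS mapping to /etc/crypttab
  crypttab:
    name: luks-84abe487-95a9-4263-9b50-0c8586c8c52c
    backing_device: /dev/mapper/foo-test1
    password: "-"              # prompt/none; matches the entry shown in the log
    state: present
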
***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 11:10:10 -0400 (0:00:00.277) 0:23:26.604 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "84abe487-95a9-4263-9b50-0c8586c8c52c" }, "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "size": "4G", "type": "crypt", "uuid": "cdf7e35e-5d0b-4e08-9afa-237b7c89391c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 11:10:11 -0400 (0:00:01.606) 0:23:28.210 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002600", "end": "2025-05-13 11:10:12.902403", "rc": 0, "start": "2025-05-13 11:10:12.899803" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 11:10:13 -0400 (0:00:01.397) 0:23:29.607 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002612", "end": "2025-05-13 11:10:14.339215", "failed_when_result": false, "rc": 0, "start": "2025-05-13 11:10:14.336603" } STDOUT: luks-84abe487-95a9-4263-9b50-0c8586c8c52c /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 11:10:14 -0400 (0:00:01.465) 0:23:31.072 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node8 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Tuesday 13 May 2025 11:10:14 -0400 (0:00:00.347) 0:23:31.420 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Tuesday 13 May 2025 11:10:15 -0400 (0:00:00.125) 0:23:31.546 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023283", "end": "2025-05-13 11:10:16.349696", "rc": 0, "start": "2025-05-13 11:10:16.326413" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Tuesday 13 May 2025 11:10:16 -0400 (0:00:01.477) 0:23:33.023 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Tuesday 13 May 2025 11:10:16 -0400 (0:00:00.259) 0:23:33.283 *********** included: 
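The VG shared check above is a plain command-plus-assert pattern: vgs reports the shared attribute as a binary flag, and the test expects "0" here because the pool was created with shared: false. A minimal standalone sketch of that check — illustrative only; the register name and changed_when handling are assumptions:

- name: Get VG shared value status
  command: vgs --noheadings --binary -o shared foo
  register: __vg_shared        # hypothetical register name
  changed_when: false          # read-only query
- name: Verify that VG shared value checks out
  assert:
    that:
      - __vg_shared.stdout | trim == '0'
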
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Tuesday 13 May 2025 11:10:17 -0400 (0:00:00.325) 0:23:33.608 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Tuesday 13 May 2025 11:10:17 -0400 (0:00:00.413) 0:23:34.022 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Tuesday 13 May 2025 11:10:19 -0400 (0:00:01.452) 0:23:35.474 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Tuesday 13 May 2025 11:10:19 -0400 (0:00:00.347) 0:23:35.821 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Tuesday 13 May 2025 11:10:19 -0400 (0:00:00.355) 0:23:36.176 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Tuesday 13 May 2025 11:10:20 -0400 (0:00:00.264) 0:23:36.441 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Tuesday 13 May 2025 11:10:20 -0400 (0:00:00.345) 0:23:36.786 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Tuesday 13 May 2025 11:10:20 -0400 (0:00:00.284) 0:23:37.071 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 
Tuesday 13 May 2025 11:10:20 -0400 (0:00:00.281) 0:23:37.353 *********** ok: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Tuesday 13 May 2025 11:10:21 -0400 (0:00:00.212) 0:23:37.565 *********** ok: [managed-node8] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.80 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Tuesday 13 May 2025 11:10:22 -0400 (0:00:01.545) 0:23:39.111 *********** skipping: [managed-node8] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Tuesday 13 May 2025 11:10:22 -0400 (0:00:00.319) 0:23:39.431 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node8 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Tuesday 13 May 2025 11:10:23 -0400 (0:00:00.524) 0:23:39.956 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Tuesday 13 May 2025 11:10:23 -0400 (0:00:00.285) 0:23:40.241 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Tuesday 13 May 2025 11:10:24 -0400 (0:00:00.214) 0:23:40.455 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Tuesday 13 May 2025 11:10:24 -0400 (0:00:00.652) 0:23:41.108 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Tuesday 13 May 2025 11:10:24 -0400 (0:00:00.189) 0:23:41.297 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Tuesday 13 May 2025 11:10:25 
-0400 (0:00:00.274) 0:23:41.572 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Tuesday 13 May 2025 11:10:25 -0400 (0:00:00.237) 0:23:41.810 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Tuesday 13 May 2025 11:10:25 -0400 (0:00:00.186) 0:23:41.996 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Tuesday 13 May 2025 11:10:25 -0400 (0:00:00.245) 0:23:42.242 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Tuesday 13 May 2025 11:10:26 -0400 (0:00:00.249) 0:23:42.491 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Tuesday 13 May 2025 11:10:26 -0400 (0:00:00.258) 0:23:42.750 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Tuesday 13 May 2025 11:10:26 -0400 (0:00:00.256) 0:23:43.006 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node8 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Tuesday 13 May 2025 11:10:27 -0400 (0:00:00.666) 0:23:43.672 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node8 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Tuesday 13 May 2025 11:10:27 -0400 (0:00:00.551) 0:23:44.224 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Tuesday 13 May 2025 11:10:27 
-0400 (0:00:00.174) 0:23:44.398 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Tuesday 13 May 2025 11:10:28 -0400 (0:00:00.287) 0:23:44.686 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Tuesday 13 May 2025 11:10:28 -0400 (0:00:00.144) 0:23:44.830 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Tuesday 13 May 2025 11:10:28 -0400 (0:00:00.072) 0:23:44.903 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Tuesday 13 May 2025 11:10:28 -0400 (0:00:00.098) 0:23:45.002 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Tuesday 13 May 2025 11:10:28 -0400 (0:00:00.269) 0:23:45.271 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Tuesday 13 May 2025 11:10:29 -0400 (0:00:00.286) 0:23:45.558 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node8 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Tuesday 13 May 2025 11:10:29 -0400 (0:00:00.520) 0:23:46.079 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node8 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Tuesday 13 May 2025 11:10:30 -0400 (0:00:00.493) 0:23:46.572 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Tuesday 13 May 2025 11:10:30 -0400 (0:00:00.252) 0:23:46.825 *********** skipping: [managed-node8] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Tuesday 13 May 2025 11:10:30 -0400 (0:00:00.279) 0:23:47.105 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Tuesday 13 May 2025 11:10:30 -0400 (0:00:00.327) 0:23:47.432 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Tuesday 13 May 2025 11:10:31 -0400 (0:00:00.228) 0:23:47.660 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node8 TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Tuesday 13 May 2025 11:10:31 -0400 (0:00:00.623) 0:23:48.284 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Tuesday 13 May 2025 11:10:32 -0400 (0:00:00.298) 0:23:48.582 *********** skipping: [managed-node8] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Tuesday 13 May 2025 11:10:32 -0400 (0:00:00.269) 0:23:48.852 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node8 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Tuesday 13 May 2025 11:10:32 -0400 (0:00:00.482) 0:23:49.334 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Tuesday 13 May 2025 11:10:33 -0400 (0:00:00.257) 0:23:49.592 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Tuesday 13 May 
2025 11:10:33 -0400 (0:00:00.340) 0:23:49.932 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Tuesday 13 May 2025 11:10:33 -0400 (0:00:00.252) 0:23:50.184 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Tuesday 13 May 2025 11:10:34 -0400 (0:00:00.303) 0:23:50.488 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Tuesday 13 May 2025 11:10:34 -0400 (0:00:00.205) 0:23:50.694 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Tuesday 13 May 2025 11:10:34 -0400 (0:00:00.187) 0:23:50.881 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Tuesday 13 May 2025 11:10:34 -0400 (0:00:00.153) 0:23:51.034 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node8 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Tuesday 13 May 2025 11:10:35 -0400 (0:00:00.604) 0:23:51.638 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node8 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Tuesday 13 May 2025 11:10:35 -0400 (0:00:00.472) 0:23:52.111 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Tuesday 13 May 2025 11:10:35 -0400 (0:00:00.232) 0:23:52.344 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Tuesday 13 May 2025 11:10:36 -0400 (0:00:00.727) 
0:23:53.071 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Tuesday 13 May 2025 11:10:36 -0400 (0:00:00.308) 0:23:53.380 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Tuesday 13 May 2025 11:10:37 -0400 (0:00:00.248) 0:23:53.629 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Tuesday 13 May 2025 11:10:37 -0400 (0:00:00.340) 0:23:53.969 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Tuesday 13 May 2025 11:10:37 -0400 (0:00:00.306) 0:23:54.276 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Tuesday 13 May 2025 11:10:38 -0400 (0:00:00.161) 0:23:54.437 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node8 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Tuesday 13 May 2025 11:10:38 -0400 (0:00:00.731) 0:23:55.169 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Tuesday 13 May 2025 11:10:39 -0400 (0:00:00.267) 0:23:55.436 *********** skipping: [managed-node8] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Tuesday 13 May 2025 11:10:39 -0400 (0:00:00.293) 0:23:55.729 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Tuesday 13 May 2025 11:10:39 -0400 (0:00:00.289) 0:23:56.019 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Tuesday 13 May 2025 11:10:39 -0400 (0:00:00.279) 0:23:56.299 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Tuesday 13 May 2025 11:10:40 -0400 (0:00:00.330) 0:23:56.630 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Tuesday 13 May 2025 11:10:40 -0400 (0:00:00.377) 0:23:57.007 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Tuesday 13 May 2025 11:10:40 -0400 (0:00:00.225) 0:23:57.232 *********** ok: [managed-node8] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Tuesday 13 May 2025 11:10:41 -0400 (0:00:00.321) 0:23:57.554 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 11:10:41 -0400 (0:00:00.380) 0:23:57.934 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 11:10:41 -0400 (0:00:00.252) 0:23:58.187 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 11:10:42 -0400 (0:00:01.047) 0:23:59.234 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 11:10:43 -0400 (0:00:00.363) 0:23:59.598 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 11:10:43 -0400 (0:00:00.324) 0:23:59.922 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 11:10:43 -0400 (0:00:00.287) 0:24:00.209 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 11:10:44 -0400 (0:00:00.247) 0:24:00.456 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 11:10:44 -0400 (0:00:00.159) 0:24:00.616 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 11:10:44 -0400 (0:00:00.173) 0:24:00.790 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 11:10:44 -0400 (0:00:00.183) 0:24:00.973 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 11:10:44 -0400 (0:00:00.220) 0:24:01.194 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 11:10:45 -0400 (0:00:00.310) 0:24:01.504 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 11:10:45 -0400 (0:00:00.318) 0:24:01.823 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 11:10:45 -0400 (0:00:00.226) 0:24:02.050 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 11:10:46 -0400 (0:00:00.456) 0:24:02.506 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 11:10:46 -0400 (0:00:00.266) 0:24:02.773 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 11:10:46 -0400 (0:00:00.310) 0:24:03.084 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 11:10:46 -0400 (0:00:00.240) 0:24:03.325 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 11:10:47 -0400 (0:00:00.342) 0:24:03.667 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 11:10:47 -0400 (0:00:00.239) 0:24:03.906 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 11:10:47 -0400 (0:00:00.304) 0:24:04.211 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 11:10:48 -0400 (0:00:00.738) 0:24:04.950 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148991.77474, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148991.77474, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 246576, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148991.77474, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 11:10:49 -0400 (0:00:01.452) 0:24:06.402 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 11:10:50 -0400 (0:00:00.201) 0:24:06.604 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 11:10:50 -0400 (0:00:00.232) 0:24:06.837 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 11:10:50 -0400 (0:00:00.328) 0:24:07.166 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 11:10:50 -0400 (0:00:00.208) 0:24:07.374 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 11:10:51 -0400 (0:00:00.164) 0:24:07.539 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 11:10:51 -0400 (0:00:00.263) 0:24:07.802 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747148991.9207401, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747148991.9207401, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 263124, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1747148991.9207401, "nlink": 1, "path": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 11:10:52 -0400 (0:00:01.570) 0:24:09.372 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 11:10:57 -0400 (0:00:04.393) 0:24:13.766 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010203", "end": "2025-05-13 11:10:58.356072", "rc": 0, "start": "2025-05-13 11:10:58.345869" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 84abe487-95a9-4263-9b50-0c8586c8c52c Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 933875 Threads: 2 Salt: 3d 89 6c c0 bc f3 24 b6 97 2c 3f b5 06 87 37 29 4c 7d 75 1c d3 11 65 
16 cf 3e 13 83 63 5c 6a 8d AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 1b dd 2c 32 6c e1 61 cf e5 8d ff 25 a3 6d 93 14 b9 77 df 1d f1 bd 3e 73 92 03 20 7b ce 4e a2 01 Digest: b1 54 1b 3a 9a ca 68 36 16 50 fe c6 e6 42 fe 88 9f af 5a 53 6a 07 10 a9 77 01 d3 4e 5f ee da 0d TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 11:10:58 -0400 (0:00:01.287) 0:24:15.054 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 11:10:58 -0400 (0:00:00.244) 0:24:15.298 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 11:10:59 -0400 (0:00:00.496) 0:24:15.795 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 11:10:59 -0400 (0:00:00.295) 0:24:16.091 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 11:10:59 -0400 (0:00:00.327) 0:24:16.418 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 11:11:00 -0400 (0:00:00.216) 0:24:16.635 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 11:11:00 -0400 (0:00:00.241) 0:24:16.876 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 11:11:00 -0400 (0:00:00.209) 0:24:17.086 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-84abe487-95a9-4263-9b50-0c8586c8c52c /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 11:11:00 -0400 (0:00:00.333) 0:24:17.419 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 11:11:01 -0400 (0:00:00.251) 0:24:17.670 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 11:11:01 -0400 (0:00:00.372) 0:24:18.042 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 11:11:01 -0400 (0:00:00.372) 0:24:18.415 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 11:11:02 -0400 (0:00:00.318) 0:24:18.733 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 11:11:02 -0400 (0:00:00.187) 0:24:18.921 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 11:11:02 -0400 (0:00:00.179) 0:24:19.100 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 11:11:02 -0400 (0:00:00.183) 0:24:19.284 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 11:11:03 -0400 (0:00:00.267) 0:24:19.552 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 11:11:03 -0400 (0:00:00.229) 0:24:19.782 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 11:11:03 -0400 (0:00:00.193) 0:24:19.976 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 11:11:03 -0400 (0:00:00.180) 0:24:20.157 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 11:11:03 -0400 (0:00:00.241) 0:24:20.399 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 11:11:04 -0400 (0:00:00.268) 0:24:20.667 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 11:11:04 -0400 (0:00:00.207) 0:24:20.874 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 11:11:04 -0400 (0:00:00.259) 0:24:21.134 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 11:11:06 -0400 (0:00:01.418) 0:24:22.553 *********** ok: [managed-node8] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 11:11:07 -0400 (0:00:01.345) 0:24:23.898 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 11:11:07 -0400 
(0:00:00.252) 0:24:24.151 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 11:11:07 -0400 (0:00:00.133) 0:24:24.284 *********** ok: [managed-node8] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 11:11:09 -0400 (0:00:01.312) 0:24:25.596 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 11:11:09 -0400 (0:00:00.229) 0:24:25.826 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 11:11:09 -0400 (0:00:00.206) 0:24:26.032 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 11:11:09 -0400 (0:00:00.187) 0:24:26.219 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 11:11:10 -0400 (0:00:00.263) 0:24:26.483 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 11:11:10 -0400 (0:00:00.155) 0:24:26.639 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 11:11:10 -0400 (0:00:00.200) 0:24:26.839 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 11:11:10 -0400 (0:00:00.270) 0:24:27.110 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 11:11:10 -0400 (0:00:00.260) 
0:24:27.371 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 11:11:11 -0400 (0:00:00.252) 0:24:27.623 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 11:11:11 -0400 (0:00:00.308) 0:24:27.932 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 11:11:11 -0400 (0:00:00.199) 0:24:28.131 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 11:11:11 -0400 (0:00:00.182) 0:24:28.313 *********** skipping: [managed-node8] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 11:11:12 -0400 (0:00:00.749) 0:24:29.063 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 11:11:12 -0400 (0:00:00.227) 0:24:29.290 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 11:11:13 -0400 (0:00:00.211) 0:24:29.502 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 11:11:13 -0400 (0:00:00.216) 0:24:29.718 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 11:11:13 -0400 (0:00:00.221) 0:24:29.940 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 11:11:13 -0400 (0:00:00.162) 
0:24:30.103 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 11:11:13 -0400 (0:00:00.213) 0:24:30.316 *********** ok: [managed-node8] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 11:11:14 -0400 (0:00:00.238) 0:24:30.555 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 11:11:14 -0400 (0:00:00.204) 0:24:30.760 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 11:11:14 -0400 (0:00:00.313) 0:24:31.073 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023180", "end": "2025-05-13 11:11:15.780470", "rc": 0, "start": "2025-05-13 11:11:15.757290" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 11:11:16 -0400 (0:00:01.367) 0:24:32.441 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 11:11:16 -0400 (0:00:00.205) 0:24:32.646 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 11:11:16 -0400 (0:00:00.213) 0:24:32.860 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 11:11:16 -0400 (0:00:00.218) 0:24:33.078 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
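(The "Get information about the LV" and "Check segment type" steps above boil down to running lvs against foo/test1 and asserting on the reported segment type. A minimal stand-alone re-creation is sketched below: the lvs command line is copied from the log output, while the task layout and the lvs_info register name are illustrative and not the test suite's actual code.)

    - name: Get information about the LV            # command string taken verbatim from the log
      command: >-
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lvs_info
      changed_when: false

    - name: Check segment type is linear            # matches the LVM2_SEGTYPE=linear shown in STDOUT above
      assert:
        that:
          - "'LVM2_SEGTYPE=linear' in lvs_info.stdout"
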
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 11:11:16 -0400 (0:00:00.148) 0:24:33.227 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 11:11:16 -0400 (0:00:00.157) 0:24:33.385 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 11:11:17 -0400 (0:00:00.192) 0:24:33.577 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 11:11:17 -0400 (0:00:00.220) 0:24:33.798 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 11:11:17 -0400 (0:00:00.094) 0:24:33.892 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:510 Tuesday 13 May 2025 11:11:17 -0400 (0:00:00.097) 0:24:33.990 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Tuesday 13 May 2025 11:11:17 -0400 (0:00:00.412) 0:24:34.403 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Tuesday 13 May 2025 11:11:18 -0400 (0:00:00.195) 0:24:34.598 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Tuesday 13 May 2025 11:11:18 -0400 (0:00:00.178) 0:24:34.776 *********** skipping: [managed-node8] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node8] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", 
"libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node8] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Tuesday 13 May 2025 11:11:18 -0400 (0:00:00.438) 0:24:35.215 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Tuesday 13 May 2025 11:11:19 -0400 (0:00:00.239) 0:24:35.454 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Tuesday 13 May 2025 11:11:19 -0400 (0:00:00.267) 0:24:35.722 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Tuesday 13 May 2025 11:11:19 -0400 (0:00:00.171) 0:24:35.893 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Tuesday 13 May 2025 11:11:19 -0400 (0:00:00.187) 0:24:36.081 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Tuesday 13 May 2025 11:11:19 -0400 (0:00:00.339) 0:24:36.421 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Tuesday 13 May 2025 11:11:24 -0400 (0:00:04.511) 0:24:40.933 *********** ok: [managed-node8] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' 
is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Tuesday 13 May 2025 11:11:24 -0400 (0:00:00.193) 0:24:41.126 *********** ok: [managed-node8] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Tuesday 13 May 2025 11:11:24 -0400 (0:00:00.299) 0:24:41.426 *********** ok: [managed-node8] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Tuesday 13 May 2025 11:11:30 -0400 (0:00:05.606) 0:24:47.032 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node8 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Tuesday 13 May 2025 11:11:31 -0400 (0:00:00.483) 0:24:47.515 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Tuesday 13 May 2025 11:11:31 -0400 (0:00:00.205) 0:24:47.721 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Tuesday 13 May 2025 11:11:31 -0400 (0:00:00.255) 0:24:47.976 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Tuesday 13 May 2025 11:11:31 -0400 (0:00:00.195) 0:24:48.172 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Tuesday 13 May 2025 11:11:36 -0400 (0:00:04.794) 0:24:52.967 *********** ok: [managed-node8] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": 
{ "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { 
"name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { 
"name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": 
"unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 13 May 2025 11:11:40 -0400 (0:00:03.617) 0:24:56.584 *********** ok: [managed-node8] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 13 May 2025 11:11:40 -0400 (0:00:00.274) 0:24:56.859 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 13 May 2025 11:11:40 -0400 (0:00:00.097) 0:24:56.957 *********** changed: [managed-node8] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 13 May 2025 11:11:46 -0400 (0:00:05.906) 0:25:02.863 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 13 May 2025 11:11:46 -0400 
(0:00:00.165) 0:25:03.029 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747149001.4337547, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a0d0301f4ae81731a3d4f2c4e226d6e0aeafdd27", "ctime": 1747149001.4307547, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 362807433, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1747149001.4307547, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3624722128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 13 May 2025 11:11:47 -0400 (0:00:01.351) 0:25:04.380 *********** ok: [managed-node8] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 13 May 2025 11:11:49 -0400 (0:00:01.704) 0:25:06.084 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 13 May 2025 11:11:49 -0400 (0:00:00.136) 0:25:06.221 *********** ok: [managed-node8] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 
0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 13 May 2025 11:11:49 -0400 (0:00:00.189) 0:25:06.410 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 13 May 2025 11:11:50 -0400 (0:00:00.257) 0:25:06.668 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 13 May 2025 11:11:50 -0400 (0:00:00.132) 0:25:06.800 *********** changed: [managed-node8] => (item={'src': '/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-84abe487-95a9-4263-9b50-0c8586c8c52c" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 13 May 2025 11:11:51 -0400 (0:00:01.219) 0:25:08.019 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 13 May 2025 11:11:53 -0400 (0:00:01.752) 0:25:09.772 *********** TASK 
[fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 13 May 2025 11:11:53 -0400 (0:00:00.295) 0:25:10.068 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 13 May 2025 11:11:53 -0400 (0:00:00.184) 0:25:10.252 *********** ok: [managed-node8] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 13 May 2025 11:11:55 -0400 (0:00:01.675) 0:25:11.927 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747149014.3377743, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b5ddc5668d43f4d330efcd14d2fac1fca6ed8949", "ctime": 1747149006.7557628, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 148897989, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1747149006.7537627, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2453802545", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 13 May 2025 11:11:57 -0400 (0:00:01.876) 0:25:13.804 *********** changed: [managed-node8] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-84abe487-95a9-4263-9b50-0c8586c8c52c', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-84abe487-95a9-4263-9b50-0c8586c8c52c", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 13 May 2025 11:11:59 -0400 (0:00:01.774) 0:25:15.579 *********** ok: [managed-node8] TASK [Verify role results] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:520 Tuesday 13 May 2025 11:12:01 -0400 (0:00:01.976) 0:25:17.555 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node8 TASK [Print out pool information] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Tuesday 13 May 2025 11:12:01 -0400 (0:00:00.406) 0:25:17.962 *********** skipping: [managed-node8] => {} TASK [Print out volume information] 
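(The "Manage /etc/crypttab to account for changes we just made" step above reports that one line was removed for the retired LUKS device. As a stand-alone task this is roughly what the crypttab module does with state: absent; the name and backing_device values come from the log entry, and using the module directly here is an assumption about rough equivalence, not the role's literal implementation.)

    - name: Drop the stale crypttab entry for the removed LUKS volume
      crypttab:
        name: luks-84abe487-95a9-4263-9b50-0c8586c8c52c
        backing_device: /dev/mapper/foo-test1
        state: absent
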
******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Tuesday 13 May 2025 11:12:01 -0400 (0:00:00.276) 0:25:18.238 *********** ok: [managed-node8] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=iINftm-y3A5-jdNp-Swd3-SUr7-cdJf-M8yk4G", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Tuesday 13 May 2025 11:12:02 -0400 (0:00:00.261) 0:25:18.500 *********** ok: [managed-node8] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Tuesday 13 May 2025 11:12:03 -0400 (0:00:01.021) 0:25:19.521 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002678", "end": "2025-05-13 11:12:04.002255", "rc": 0, "start": "2025-05-13 11:12:03.999577" } STDOUT: # system_role:storage # # /etc/fstab # Created by 
anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Tuesday 13 May 2025 11:12:04 -0400 (0:00:01.184) 0:25:20.706 *********** ok: [managed-node8] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002705", "end": "2025-05-13 11:12:05.494017", "failed_when_result": false, "rc": 0, "start": "2025-05-13 11:12:05.491312" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Tuesday 13 May 2025 11:12:05 -0400 (0:00:01.519) 0:25:22.226 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Tuesday 13 May 2025 11:12:05 -0400 (0:00:00.129) 0:25:22.355 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node8 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Tuesday 13 May 2025 11:12:06 -0400 (0:00:00.396) 0:25:22.752 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Tuesday 13 May 2025 11:12:07 -0400 (0:00:00.889) 0:25:23.642 *********** included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node8 included: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node8 included: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node8 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Tuesday 13 May 2025 11:12:08 -0400 (0:00:01.287) 0:25:24.929 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Tuesday 13 May 2025 11:12:08 -0400 (0:00:00.237) 0:25:25.166 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Tuesday 13 May 2025 11:12:09 -0400 (0:00:00.335) 0:25:25.502 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Tuesday 13 May 2025 11:12:09 -0400 (0:00:00.205) 0:25:25.708 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Tuesday 13 May 2025 11:12:09 -0400 (0:00:00.187) 0:25:25.895 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Tuesday 13 May 2025 11:12:09 -0400 (0:00:00.238) 0:25:26.133 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Tuesday 13 May 2025 11:12:09 -0400 (0:00:00.298) 0:25:26.431 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Tuesday 13 May 2025 11:12:10 -0400 (0:00:00.309) 0:25:26.741 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Tuesday 13 May 2025 11:12:10 -0400 (0:00:00.279) 0:25:27.020 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Tuesday 13 May 2025 11:12:10 -0400 (0:00:00.199) 0:25:27.220 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Tuesday 13 May 2025 11:12:11 -0400 (0:00:00.227) 0:25:27.448 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Tuesday 13 May 2025 11:12:11 -0400 (0:00:00.232) 0:25:27.681 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Tuesday 13 May 2025 11:12:11 -0400 (0:00:00.458) 0:25:28.139 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Tuesday 13 May 2025 11:12:11 -0400 (0:00:00.246) 0:25:28.386 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Tuesday 13 May 2025 11:12:12 -0400 (0:00:00.291) 0:25:28.678 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Tuesday 13 May 2025 11:12:12 -0400 (0:00:00.273) 0:25:28.952 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Tuesday 13 May 2025 11:12:12 -0400 (0:00:00.291) 0:25:29.243 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Tuesday 13 May 2025 11:12:13 -0400 (0:00:00.265) 0:25:29.509 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Tuesday 13 May 2025 11:12:13 -0400 (0:00:00.287) 0:25:29.796 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Tuesday 13 May 2025 11:12:13 -0400 (0:00:00.319) 0:25:30.116 *********** ok: [managed-node8] => { "changed": false, "stat": { "atime": 1747149106.095914, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1747149106.095914, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36559, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1747149106.095914, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Tuesday 13 May 2025 11:12:15 -0400 (0:00:01.433) 0:25:31.550 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Tuesday 13 May 2025 11:12:15 -0400 (0:00:00.126) 0:25:31.677 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Tuesday 13 May 2025 11:12:15 -0400 (0:00:00.148) 0:25:31.826 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Tuesday 13 May 2025 11:12:15 -0400 (0:00:00.176) 0:25:32.002 *********** ok: [managed-node8] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Tuesday 13 May 2025 11:12:15 -0400 (0:00:00.193) 0:25:32.195 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Tuesday 13 May 2025 11:12:15 -0400 (0:00:00.158) 0:25:32.354 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Tuesday 13 May 2025 11:12:16 -0400 (0:00:00.165) 0:25:32.519 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Tuesday 13 May 2025 11:12:16 -0400 (0:00:00.141) 0:25:32.660 *********** ok: [managed-node8] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Tuesday 13 May 2025 11:12:20 -0400 (0:00:04.227) 0:25:36.888 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Tuesday 13 May 2025 11:12:20 -0400 (0:00:00.132) 0:25:37.021 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Tuesday 13 May 2025 11:12:20 -0400 (0:00:00.231) 0:25:37.252 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Tuesday 13 May 2025 11:12:20 -0400 (0:00:00.151) 
0:25:37.404 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Tuesday 13 May 2025 11:12:21 -0400 (0:00:00.236) 0:25:37.641 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Tuesday 13 May 2025 11:12:21 -0400 (0:00:00.230) 0:25:37.872 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Tuesday 13 May 2025 11:12:21 -0400 (0:00:00.196) 0:25:38.068 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Tuesday 13 May 2025 11:12:21 -0400 (0:00:00.179) 0:25:38.248 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Tuesday 13 May 2025 11:12:21 -0400 (0:00:00.143) 0:25:38.391 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Tuesday 13 May 2025 11:12:22 -0400 (0:00:00.260) 0:25:38.652 *********** ok: [managed-node8] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Tuesday 13 May 2025 11:12:22 -0400 (0:00:00.293) 0:25:38.946 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Tuesday 13 May 2025 11:12:22 -0400 (0:00:00.198) 0:25:39.144 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Tuesday 13 May 2025 11:12:23 -0400 (0:00:00.331) 0:25:39.476 *********** skipping: [managed-node8] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Tuesday 13 May 2025 11:12:23 -0400 (0:00:00.297) 0:25:39.773 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Tuesday 13 May 2025 11:12:23 -0400 (0:00:00.103) 0:25:39.877 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Tuesday 13 May 2025 11:12:23 -0400 (0:00:00.201) 0:25:40.079 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Tuesday 13 May 2025 11:12:23 -0400 (0:00:00.214) 0:25:40.293 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Tuesday 13 May 2025 11:12:24 -0400 (0:00:00.259) 0:25:40.553 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Tuesday 13 May 2025 11:12:24 -0400 (0:00:00.249) 0:25:40.803 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Tuesday 13 May 2025 11:12:24 -0400 (0:00:00.232) 0:25:41.035 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Tuesday 13 May 2025 11:12:24 -0400 (0:00:00.203) 0:25:41.239 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Tuesday 13 May 2025 11:12:24 -0400 (0:00:00.142) 0:25:41.381 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata 
version] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Tuesday 13 May 2025 11:12:25 -0400 (0:00:00.197) 0:25:41.579 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Tuesday 13 May 2025 11:12:25 -0400 (0:00:00.135) 0:25:41.715 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Tuesday 13 May 2025 11:12:25 -0400 (0:00:00.169) 0:25:41.885 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Tuesday 13 May 2025 11:12:25 -0400 (0:00:00.161) 0:25:42.046 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Tuesday 13 May 2025 11:12:25 -0400 (0:00:00.197) 0:25:42.243 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Tuesday 13 May 2025 11:12:26 -0400 (0:00:00.248) 0:25:42.491 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Tuesday 13 May 2025 11:12:26 -0400 (0:00:00.247) 0:25:42.739 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Tuesday 13 May 2025 11:12:26 -0400 (0:00:00.210) 0:25:42.949 *********** skipping: [managed-node8] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Tuesday 13 May 2025 11:12:26 -0400 (0:00:00.153) 0:25:43.103 *********** skipping: [managed-node8] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Tuesday 13 May 2025 11:12:26 -0400 (0:00:00.122) 0:25:43.225 *********** skipping: [managed-node8] => {} TASK [Calculate the expected size based on pool size and percentage 
value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Tuesday 13 May 2025 11:12:26 -0400 (0:00:00.110) 0:25:43.335 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Tuesday 13 May 2025 11:12:26 -0400 (0:00:00.086) 0:25:43.422 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Tuesday 13 May 2025 11:12:27 -0400 (0:00:00.206) 0:25:43.629 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Tuesday 13 May 2025 11:12:27 -0400 (0:00:00.163) 0:25:43.792 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Tuesday 13 May 2025 11:12:27 -0400 (0:00:00.148) 0:25:43.941 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Tuesday 13 May 2025 11:12:27 -0400 (0:00:00.344) 0:25:44.286 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Tuesday 13 May 2025 11:12:27 -0400 (0:00:00.118) 0:25:44.404 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Tuesday 13 May 2025 11:12:28 -0400 (0:00:00.117) 0:25:44.521 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Tuesday 13 May 2025 11:12:28 -0400 (0:00:00.196) 0:25:44.718 *********** skipping: [managed-node8] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Tuesday 13 May 2025 11:12:28 -0400 (0:00:00.143) 0:25:44.862 *********** skipping: [managed-node8] => {} TASK [Show 
test volume size] *************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Tuesday 13 May 2025 11:12:28 -0400 (0:00:00.153) 0:25:45.015 *********** skipping: [managed-node8] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Tuesday 13 May 2025 11:12:28 -0400 (0:00:00.149) 0:25:45.164 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Tuesday 13 May 2025 11:12:28 -0400 (0:00:00.171) 0:25:45.336 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Tuesday 13 May 2025 11:12:29 -0400 (0:00:00.201) 0:25:45.537 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Tuesday 13 May 2025 11:12:29 -0400 (0:00:00.221) 0:25:45.759 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Tuesday 13 May 2025 11:12:29 -0400 (0:00:00.205) 0:25:45.964 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Tuesday 13 May 2025 11:12:29 -0400 (0:00:00.286) 0:25:46.251 *********** ok: [managed-node8] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Tuesday 13 May 2025 11:12:30 -0400 (0:00:00.256) 0:25:46.508 *********** ok: [managed-node8] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Tuesday 13 May 2025 11:12:30 -0400 (0:00:00.217) 0:25:46.725 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Tuesday 13 May 2025 11:12:30 -0400 
(0:00:00.203) 0:25:46.929 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Tuesday 13 May 2025 11:12:30 -0400 (0:00:00.134) 0:25:47.064 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Tuesday 13 May 2025 11:12:30 -0400 (0:00:00.178) 0:25:47.243 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Tuesday 13 May 2025 11:12:30 -0400 (0:00:00.167) 0:25:47.410 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Tuesday 13 May 2025 11:12:31 -0400 (0:00:00.223) 0:25:47.634 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Tuesday 13 May 2025 11:12:31 -0400 (0:00:00.132) 0:25:47.766 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Tuesday 13 May 2025 11:12:31 -0400 (0:00:00.213) 0:25:47.980 *********** skipping: [managed-node8] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Tuesday 13 May 2025 11:12:31 -0400 (0:00:00.248) 0:25:48.229 *********** ok: [managed-node8] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Tuesday 13 May 2025 11:12:32 -0400 (0:00:00.256) 0:25:48.485 *********** ok: [managed-node8] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node8 : ok=1224 changed=60 unreachable=0 failed=9 skipped=1073 rescued=9 ignored=0 Tuesday 13 May 2025 11:12:32 -0400 (0:00:00.164) 0:25:48.650 *********** =============================================================================== 
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.23s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.90s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.74s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.29s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.18s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.39s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Make sure blivet is available ------- 6.37s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.19s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.95s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.91s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.87s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 5.83s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.81s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.80s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.80s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.77s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.72s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.72s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.68s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 5.61s
/tmp/collections-nVK/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
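For reference, the end state verified in the final pass above (volume "foo" on /dev/sda, unencrypted, state "absent") corresponds to a storage-role invocation along the lines of the following minimal sketch. This is an illustration, not the playbook that produced this log; the host name and disk come from the output above and should be treated as placeholders.

- name: Remove the unencrypted test volume (illustrative sketch)
  hosts: managed-node8          # target shown in this log; replace as needed
  become: true                  # the storage role manages block devices and needs root
  tasks:
    - name: Run the storage role with the volume marked absent
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo           # volume name reported in _storage_volumes_list above
            type: disk
            disks:
              - sda             # test disk used throughout this run
            encryption: false
            state: absent       # request removal, as in the final verification pass

The verify-role-results.yml tasks above reflect that end state: the collected volume info shows no filesystem on /dev/sda, and the /etc/fstab and /etc/crypttab checks find no entry for the removed volume.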