ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
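The header above shows the candidate stdout callbacks being skipped because a stdout callback is already active. For reference, a run like this one can be reproduced with a different stdout callback via the `ANSIBLE_STDOUT_CALLBACK` environment variable, which ansible-playbook 2.9 honors. This is only a sketch: the log does not show the original invocation, so the `-i inventory` flag and the choice of the `yaml` callback below are assumptions (only the playbook path is taken from the log).

```shell
# Hypothetical re-run of this integration test (inventory and verbosity assumed;
# the playbook path is the one reported in the PLAYBOOK banner below).
export ANSIBLE_STDOUT_CALLBACK=yaml   # render task results as YAML instead of JSON
ansible-playbook -vv -i inventory \
  /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml
```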
PLAYBOOK: tests_auditd_integration.yml *****************************************
1 plays in /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml

PLAY [Integration test for linux-system-roles.auditd] **************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:4
Wednesday 22 April 2026  08:48:43 -0400 (0:00:00.019)       0:00:00.019 *******
ok: [managed-node1]
META: ran handlers

TASK [Back up existing auditd configuration and rules] *************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:9
Wednesday 22 April 2026  08:48:44 -0400 (0:00:01.010)       0:00:01.030 *******
included: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml for managed-node1

TASK [Check for existing auditd.conf] ******************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:7
Wednesday 22 April 2026  08:48:44 -0400 (0:00:00.017)       0:00:01.047 *******
ok: [managed-node1] => {"changed": false, "stat": {"atime": 1776861959.129, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "92826a0a4167a0ec9df5fc3f743dd551e25be805", "ctime": 1716968741.879, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4939770, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0640", "mtime": 1699258415.0, "nlink": 1, "path": "/etc/audit/auditd.conf", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 882, "uid": 0, "version": "3564651217", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Check for existing custom.rules] *****************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:12
Wednesday 22 April 2026  08:48:45 -0400 (0:00:00.437)       0:00:01.484 *******
ok: [managed-node1] => {"changed": false, "stat": {"exists": false}}

TASK [Remember whether custom.rules existed] ***********************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:17
Wednesday 22 April 2026  08:48:45 -0400 (0:00:00.317)       0:00:01.802 *******
ok: [managed-node1] => {"ansible_facts": {"__auditd_integration_had_custom_rules": false}, "changed": false}

TASK [Back up auditd.conf before test] *****************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:21
Wednesday 22 April 2026  08:48:45 -0400 (0:00:00.034)       0:00:01.836 *******
changed: [managed-node1] => {"changed": true, "checksum": "92826a0a4167a0ec9df5fc3f743dd551e25be805", "dest": "/root/.lsr_auditd_integration_auditd.conf.bak", "gid": 0, "group": "root", "md5sum": "fd5c639b8b1bd57c486dab75985ad9af", "mode": "0640", "owner": "root", "secontext": "system_u:object_r:admin_home_t:s0", "size": 882, "src": "/etc/audit/auditd.conf", "state": "file", "uid": 0}

TASK [Back up custom.rules before test] ****************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:29
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.442)       0:00:02.279 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Record auditd.conf backup path for cleanup] ******************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:37
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.031)       0:00:02.310 *******
ok: [managed-node1] => {"ansible_facts": {"__auditd_integration_backup_auditd_conf": "/root/.lsr_auditd_integration_auditd.conf.bak"}, "changed": false}

TASK [Record custom.rules backup path for cleanup] *****************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:42
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.033)       0:00:02.344 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set filter based on platform/version] ************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:12
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.031)       0:00:02.375 *******
ok: [managed-node1] => {"ansible_facts": {"__filter": "exit"}, "changed": false}

TASK [Run auditd role with non-default settings] *******************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:16
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.033)       0:00:02.408 *******
included: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/run_role_with_clear_facts.yml for managed-node1
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/run_role_with_clear_facts.yml:23
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.019)       0:00:02.428 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/run_role_with_clear_facts.yml:33
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.031)       0:00:02.459 *******

TASK [fedora.linux_system_roles.auditd : Set platform/version specific variables] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/main.yml:3
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.064)       0:00:02.524 *******
included: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.auditd : Ensure ansible_facts used by role] ****
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:2
Wednesday 22 April 2026  08:48:46 -0400 (0:00:00.019)       0:00:02.543 *******
ok: [managed-node1]

TASK [fedora.linux_system_roles.auditd : Check if system is ostree] ************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:10
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.695)       0:00:03.238 *******
ok: [managed-node1] => {"changed": false, "stat": {"exists": false}}

TASK [fedora.linux_system_roles.auditd : Set flag to indicate system is ostree] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:15
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.317)       0:00:03.556 *******
ok: [managed-node1] => {"ansible_facts": {"__auditd_is_ostree": false}, "changed": false}

TASK [fedora.linux_system_roles.auditd : Set platform/version specific variables] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:19
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.017)       0:00:03.574 *******
skipping: [managed-node1] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node1] => (item=CentOS.yml) => {"ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False"}
ok: [managed-node1] => (item=CentOS_8.yml) => {"ansible_facts": {"__auditd_rules_filters": ["exclude", "exit", "filesystem", "task", "user"]}, "ansible_included_var_files": ["/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/vars/CentOS_8.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml"}
ok: [managed-node1] => (item=CentOS_8.yml) => {"ansible_facts": {"__auditd_rules_filters": ["exclude", "exit", "filesystem", "task", "user"]}, "ansible_included_var_files": ["/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/vars/CentOS_8.yml"], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml"}

TASK [fedora.linux_system_roles.auditd : Resolve package names for OS family] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:34
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.040)       0:00:03.614 *******
ok: [managed-node1] => {"ansible_facts": {"__auditd_packages": ["audit"]}, "changed": false}

TASK [fedora.linux_system_roles.auditd : Validate role parameters] *************
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/main.yml:8
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.014)       0:00:03.629 *******
included: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml for managed-node1

TASK [fedora.linux_system_roles.auditd : Assert num_logs range (num_logs_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:4
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.026)       0:00:03.656 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert freq range (freq_parser)] ******
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:11
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.671 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert incremental flush requires non-zero freq (sanity_check)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:18
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.016)       0:00:03.688 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert priority_boost range (priority_boost_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:25
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.014)       0:00:03.703 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert q_depth range (q_depth_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:31
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.016)       0:00:03.719 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert auditd_maximum_rate is null or a non-negative integer] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:38
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.735 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert auditd_backlog_wait_time is null or a non-negative integer] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:48
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.751 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert max_restarts range (max_restarts_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:58
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.767 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert tcp_listen_port range when listener enabled in build (tcp_listen_port_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:64
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.014)       0:00:03.782 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert tcp_listen_queue range (tcp_listen_queue_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:70
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.797 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert tcp_max_per_addr range (tcp_max_per_addr_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:76
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.813 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert tcp_client_max_idle range (tcp_client_max_idle_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:82
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.828 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert tcp_client_ports format (tcp_client_ports_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:88
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.844 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert tcp_client_ports range order] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:98
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.017)       0:00:03.862 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert space_left_action rejects halt (space_action_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:107
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.018)       0:00:03.880 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert space_left percentage is between 1 and 99 when given as N%] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:113
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.015)       0:00:03.896 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert admin_space_left percentage is between 1 and 99 when given as N%] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:125
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.013)       0:00:03.909 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert space_left is greater than admin_space_left when both use same form] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:137
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.012)       0:00:03.922 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert disk_full_action rejects email (disk_full_action_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:161
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.020)       0:00:03.942 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert disk_error_action rejects email and rotate (disk_error_action_parser)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:167
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.014)       0:00:03.957 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert exec companion paths when action is exec] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:173
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.014)       0:00:03.971 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert space_left_action exec path] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:180
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.013)       0:00:03.984 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert admin_space_left_action exec path] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:187
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.011)       0:00:03.996 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert disk_full_action exec path] ****
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:194
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.013)       0:00:04.009 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert disk_error_action exec path] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:201
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.012)       0:00:04.022 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert name when name_format is user (resolve_node)] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:208
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.012)       0:00:04.034 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Validate auditd_rules structure and values] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:215
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.016)       0:00:04.050 *******
included: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml for managed-node1

TASK [fedora.linux_system_roles.auditd : Assert auditd_rules is a list of dicts] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:4
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.035)       0:00:04.086 *******
ok: [managed-node1] => {"changed": false}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert each auditd_rules entry has required action and filter] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:13
Wednesday 22 April 2026  08:48:47 -0400 (0:00:00.023)       0:00:04.110 *******
ok: [managed-node1] => (item=never,filesystem) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "never", "field": "fstype=debugfs", "filter": "filesystem", "keyname": "lsr_fs_dbg"}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "filter": "exit", "keyname": "lsr_io_uring", "syscall": ["openat", "openat2"]}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "nowarn", "filter": "exit", "keyname": "lsr_arch_nw", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": ["b32", "b64"], "filter": "exit", "keyname": "lsr_arch_pair", "syscall": "read"}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_path_nw", "path": "/etc/hosts", "permission": "nowarn", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/etc/ssh", "filter": "exit", "keyname": "lsr_dir_one_perm", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/tmp", "filter": "exit", "keyname": "lsr_dir_many_perm", "permission": ["read", "write"], "syscall": "openat"}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_sc_list", "syscall": ["open", "close"]}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "field": ["success=1", "auid!=-1", "exit<-1", "gid>10", "euid<=65534", "pid>=0", "a1&0100", "a2&=0200"], "filter": "exit", "keyname": ["lsr_ops", "lsr_ops2", "lsr_ops3"], "path": "/etc/issue", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item=always,exit) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": ["kk1", "kk2"], "syscall": "dup"}}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert auditd_rules entries use only supported keys] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:29
Wednesday 22 April 2026  08:48:48 -0400 (0:00:00.131)       0:00:04.242 *******
ok: [managed-node1] => (item={'action': 'never', 'filter': 'filesystem', 'field': 'fstype=debugfs', 'keyname': 'lsr_fs_dbg'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "never", "field": "fstype=debugfs", "filter": "filesystem", "keyname": "lsr_fs_dbg"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'syscall': ['openat', 'openat2'], 'keyname': 'lsr_io_uring'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "filter": "exit", "keyname": "lsr_io_uring", "syscall": ["openat", "openat2"]}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'nowarn', 'syscall': 'open', 'keyname': 'lsr_arch_nw'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "nowarn", "filter": "exit", "keyname": "lsr_arch_nw", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': ['b32', 'b64'], 'syscall': 'read', 'keyname': 'lsr_arch_pair'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": ["b32", "b64"], "filter": "exit", "keyname": "lsr_arch_pair", "syscall": "read"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/hosts', 'permission': 'nowarn', 'keyname': 'lsr_path_nw'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_path_nw", "path": "/etc/hosts", "permission": "nowarn", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'dir': '/etc/ssh', 'permission': 'read', 'keyname': 'lsr_dir_one_perm'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/etc/ssh", "filter": "exit", "keyname": "lsr_dir_one_perm", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'openat', 'dir': '/tmp', 'permission': ['read', 'write'], 'keyname': 'lsr_dir_many_perm'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/tmp", "filter": "exit", "keyname": "lsr_dir_many_perm", "permission": ["read", "write"], "syscall": "openat"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': ['open', 'close'], 'keyname': 'lsr_sc_list'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_sc_list", "syscall": ["open", "close"]}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/issue', 'permission': 'read', 'field': ['success=1', 'auid!=-1', 'exit<-1', 'gid>10', 'euid<=65534', 'pid>=0', 'a1&0100', 'a2&=0200'], 'keyname': ['lsr_ops', 'lsr_ops2', 'lsr_ops3']}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "field": ["success=1", "auid!=-1", "exit<-1", "gid>10", "euid<=65534", "pid>=0", "a1&0100", "a2&=0200"], "filter": "exit", "keyname": ["lsr_ops", "lsr_ops2", "lsr_ops3"], "path": "/etc/issue", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'dup', 'keyname': ['kk1', 'kk2']}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": ["kk1", "kk2"], "syscall": "dup"}}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert auditd_rules path and dir are mutually exclusive] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:49
Wednesday 22 April 2026  08:48:48 -0400 (0:00:00.085)       0:00:04.327 *******
ok: [managed-node1] => (item={'action': 'never', 'filter': 'filesystem', 'field': 'fstype=debugfs', 'keyname': 'lsr_fs_dbg'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "never", "field": "fstype=debugfs", "filter": "filesystem", "keyname": "lsr_fs_dbg"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'syscall': ['openat', 'openat2'], 'keyname': 'lsr_io_uring'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "filter": "exit", "keyname": "lsr_io_uring", "syscall": ["openat", "openat2"]}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'nowarn', 'syscall': 'open', 'keyname': 'lsr_arch_nw'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "nowarn", "filter": "exit", "keyname": "lsr_arch_nw", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': ['b32', 'b64'], 'syscall': 'read', 'keyname': 'lsr_arch_pair'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": ["b32", "b64"], "filter": "exit", "keyname": "lsr_arch_pair", "syscall": "read"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/hosts', 'permission': 'nowarn', 'keyname': 'lsr_path_nw'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_path_nw", "path": "/etc/hosts", "permission": "nowarn", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'dir': '/etc/ssh', 'permission': 'read', 'keyname': 'lsr_dir_one_perm'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/etc/ssh", "filter": "exit", "keyname": "lsr_dir_one_perm", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'openat', 'dir': '/tmp', 'permission': ['read', 'write'], 'keyname': 'lsr_dir_many_perm'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/tmp", "filter": "exit", "keyname": "lsr_dir_many_perm", "permission": ["read", "write"], "syscall": "openat"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': ['open', 'close'], 'keyname': 'lsr_sc_list'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_sc_list", "syscall": ["open", "close"]}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/issue', 'permission': 'read', 'field': ['success=1', 'auid!=-1', 'exit<-1', 'gid>10', 'euid<=65534', 'pid>=0', 'a1&0100', 'a2&=0200'], 'keyname': ['lsr_ops', 'lsr_ops2', 'lsr_ops3']}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "field": ["success=1", "auid!=-1", "exit<-1", "gid>10", "euid<=65534", "pid>=0", "a1&0100", "a2&=0200"], "filter": "exit", "keyname": ["lsr_ops", "lsr_ops2", "lsr_ops3"], "path": "/etc/issue", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'dup', 'keyname': ['kk1', 'kk2']}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": ["kk1", "kk2"], "syscall": "dup"}}
MSG: All assertions passed

TASK [fedora.linux_system_roles.auditd : Assert auditd_rules path or dir requires filter exit] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:56
Wednesday 22 April 2026  08:48:48 -0400 (0:00:00.068)       0:00:04.396 *******
skipping: [managed-node1] => (item={'action': 'never', 'filter': 'filesystem', 'field': 'fstype=debugfs', 'keyname': 'lsr_fs_dbg'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "never", "field": "fstype=debugfs", "filter": "filesystem", "keyname": "lsr_fs_dbg"}, "skip_reason": "Conditional result was False"}
skipping: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'syscall': ['openat', 'openat2'], 'keyname': 'lsr_io_uring'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "filter": "exit", "keyname": "lsr_io_uring", "syscall": ["openat", "openat2"]}, "skip_reason": "Conditional result was False"}
skipping: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'nowarn', 'syscall': 'open', 'keyname': 'lsr_arch_nw'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "nowarn", "filter": "exit", "keyname": "lsr_arch_nw", "syscall": "open"}, "skip_reason": "Conditional result was False"}
skipping: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': ['b32', 'b64'], 'syscall': 'read', 'keyname': 'lsr_arch_pair'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": ["b32", "b64"], "filter": "exit", "keyname": "lsr_arch_pair", "syscall": "read"}, "skip_reason": "Conditional result was False"}
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/hosts', 'permission': 'nowarn', 'keyname': 'lsr_path_nw'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_path_nw", "path": "/etc/hosts", "permission": "nowarn", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'dir': '/etc/ssh', 'permission': 'read', 'keyname': 'lsr_dir_one_perm'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/etc/ssh", "filter": "exit", "keyname": "lsr_dir_one_perm", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'openat', 'dir': '/tmp', 'permission': ['read', 'write'], 'keyname': 'lsr_dir_many_perm'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "dir": "/tmp", "filter": "exit", "keyname": "lsr_dir_many_perm", "permission": ["read", "write"], "syscall": "openat"}}
MSG: All assertions passed
skipping: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': ['open', 'close'], 'keyname': 'lsr_sc_list'}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_sc_list", "syscall": ["open", "close"]}, "skip_reason": "Conditional result was False"}
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/issue', 'permission': 'read', 'field': ['success=1', 'auid!=-1', 'exit<-1', 'gid>10', 'euid<=65534', 'pid>=0', 'a1&0100', 'a2&=0200'], 'keyname': ['lsr_ops', 'lsr_ops2', 'lsr_ops3']}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "field": ["success=1", "auid!=-1", "exit<-1", "gid>10", "euid<=65534", "pid>=0", "a1&0100", "a2&=0200"], "filter": "exit", "keyname": ["lsr_ops", "lsr_ops2", "lsr_ops3"], "path": "/etc/issue", "permission": "read", "syscall": "open"}}
MSG: All assertions passed
skipping: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'dup', 'keyname': ['kk1', 'kk2']}) => {"ansible_loop_var": "item", "changed": false, "item": {"action": "always", "arch": "b64", "filter": "exit", "keyname": ["kk1", "kk2"], "syscall": "dup"}, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.auditd : Assert
auditd_rules syscall requires valid filter type] ***
task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:64
Wednesday 22 April 2026  08:48:48 -0400 (0:00:00.069)       0:00:04.465 *******
skipping: [managed-node1] => (item={'action': 'never', 'filter': 'filesystem', 'field': 'fstype=debugfs', 'keyname': 'lsr_fs_dbg'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "never", "field": "fstype=debugfs", "filter": "filesystem", "keyname": "lsr_fs_dbg" }, "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'syscall': ['openat', 'openat2'], 'keyname': 'lsr_io_uring'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "filter": "exit", "keyname": "lsr_io_uring", "syscall": [ "openat", "openat2" ] } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'nowarn', 'syscall': 'open', 'keyname': 'lsr_arch_nw'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "arch": "nowarn", "filter": "exit", "keyname": "lsr_arch_nw", "syscall": "open" } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': ['b32', 'b64'], 'syscall': 'read', 'keyname': 'lsr_arch_pair'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "arch": [ "b32", "b64" ], "filter": "exit", "keyname": "lsr_arch_pair", "syscall": "read" } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/hosts', 'permission': 'nowarn', 'keyname': 'lsr_path_nw'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_path_nw", "path": "/etc/hosts", "permission": "nowarn", "syscall": "open" } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'dir': '/etc/ssh', 'permission': 'read', 'keyname': 'lsr_dir_one_perm'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "arch": "b64", "dir": "/etc/ssh", "filter": "exit", "keyname": "lsr_dir_one_perm", "permission": "read", "syscall": "open" } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'openat', 'dir': '/tmp', 'permission': ['read', 'write'], 'keyname': 'lsr_dir_many_perm'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "arch": "b64", "dir": "/tmp", "filter": "exit", "keyname": "lsr_dir_many_perm", "permission": [ "read", "write" ], "syscall": "openat" } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': ['open', 'close'], 'keyname': 'lsr_sc_list'}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "arch": "b64", "filter": "exit", "keyname": "lsr_sc_list", "syscall": [ "open", "close" ] } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'open', 'path': '/etc/issue', 'permission': 'read', 'field': ['success=1', 'auid!=-1', 'exit<-1', 'gid>10', 'euid<=65534', 'pid>=0', 'a1&0100', 'a2&=0200'], 'keyname': ['lsr_ops', 'lsr_ops2', 'lsr_ops3']}) => { "ansible_loop_var": "item", "changed": false, "item": { "action": "always", "arch": "b64", "field": [ "success=1", "auid!=-1", "exit<-1", "gid>10", "euid<=65534", "pid>=0", "a1&0100", "a2&=0200" ], "filter": "exit", "keyname": [ "lsr_ops", "lsr_ops2", "lsr_ops3" ], "path": "/etc/issue", "permission": "read", "syscall": "open" } }
MSG: All assertions passed
ok: [managed-node1] => (item={'action': 'always', 'filter': 'exit', 'arch': 'b64', 'syscall': 'dup', 'keyname': ['kk1', 'kk2']}) => { "ansible_loop_var":
"item", "changed": false, "item": { "action": "always", "arch": "b64", "filter": "exit", "keyname": [ "kk1", "kk2" ], "syscall": "dup" } } MSG: All assertions passed TASK [fedora.linux_system_roles.auditd : Assert auditd_rules field is a non-empty string or list of non-empty strings] *** task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:74 Wednesday 22 April 2026 08:48:48 -0400 (0:00:00.115) 0:00:04.581 ******* fatal: [managed-node1]: FAILED! => {} MSG: The conditional check 'item.field is auditd_non_empty_str_or_list' failed. The error was: No test named 'auditd_non_empty_str_or_list' found. TASK [Cat /etc/audit/audit.rules] ********************************************** task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:266 Wednesday 22 April 2026 08:48:48 -0400 (0:00:00.022) 0:00:04.603 ******* fatal: [managed-node1]: FAILED! => { "changed": false, "cmd": [ "cat", "/etc/audit/audit.rules" ], "delta": "0:00:00.002242", "end": "2026-04-22 08:48:48.839576", "failed_when_result": true, "rc": 0, "start": "2026-04-22 08:48:48.837334" } STDOUT: ## This file is automatically generated from /etc/audit/rules.d -D -b 8192 -f 1 --backlog_wait_time 60000 TASK [Restore system state after integration test] ***************************** task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:272 Wednesday 22 April 2026 08:48:48 -0400 (0:00:00.418) 0:00:05.022 ******* included: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml for managed-node1 TASK [Restore auditd.conf from integration test backup] ************************ task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:3 Wednesday 22 April 2026 08:48:48 -0400 (0:00:00.018) 0:00:05.040 ******* ok: [managed-node1] => { 
"changed": false, "checksum": "92826a0a4167a0ec9df5fc3f743dd551e25be805", "dest": "/etc/audit/auditd.conf", "gid": 0, "group": "root", "md5sum": "fd5c639b8b1bd57c486dab75985ad9af", "mode": "0640", "owner": "root", "secontext": "system_u:object_r:auditd_etc_t:s0", "size": 882, "src": "/root/.lsr_auditd_integration_auditd.conf.bak", "state": "file", "uid": 0 } TASK [Remove auditd.conf backup file] ****************************************** task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:11 Wednesday 22 April 2026 08:48:49 -0400 (0:00:00.343) 0:00:05.384 ******* changed: [managed-node1] => { "changed": true, "path": "/root/.lsr_auditd_integration_auditd.conf.bak", "state": "absent" } TASK [Restore custom.rules from integration test backup] *********************** task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:17 Wednesday 22 April 2026 08:48:49 -0400 (0:00:00.431) 0:00:05.815 ******* skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Remove custom.rules backup file] ***************************************** task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:25 Wednesday 22 April 2026 08:48:49 -0400 (0:00:00.013) 0:00:05.828 ******* skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Remove custom.rules if it did not exist before test] ********************* task path: /tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:31 Wednesday 22 April 2026 08:48:49 -0400 (0:00:00.012) 0:00:05.841 ******* ok: [managed-node1] => { "changed": false, "path": "/etc/audit/rules.d/custom.rules", "state": "absent" } PLAY RECAP ********************************************************************* managed-node1 : ok=46 changed=2 unreachable=0 failed=1 skipped=12 
rescued=1    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
  { "ansible_version": "2.9.27", "end_time": "2026-04-22T12:48:48.465360+00:00Z", "host": "managed-node1", "message": "The conditional check 'item.field is auditd_non_empty_str_or_list' failed. The error was: No test named 'auditd_non_empty_str_or_list' found.", "start_time": "2026-04-22T12:48:48.445868+00:00Z", "task_name": "Assert auditd_rules field is a non-empty string or list of non-empty strings", "task_path": "/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:74" },
  { "ansible_version": "2.9.27", "delta": "0:00:00.002242", "end_time": "2026-04-22 08:48:48.839576", "host": "managed-node1", "message": "No message could be found", "rc": 0, "start_time": "2026-04-22 08:48:48.837334", "stdout": "## This file is automatically generated from /etc/audit/rules.d\n-D\n-b 8192\n-f 1\n--backlog_wait_time 60000", "task_name": "Cat /etc/audit/audit.rules", "task_path": "/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:266" }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Wednesday 22 April 2026  08:48:50 -0400 (0:00:00.328)       0:00:06.170 *******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:4
fedora.linux_system_roles.auditd : Ensure ansible_facts used by role ---- 0.70s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:2
Back up auditd.conf before test ----------------------------------------- 0.44s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:21
Check for existing auditd.conf ------------------------------------------ 0.44s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:7
Remove auditd.conf backup file ------------------------------------------ 0.43s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:11
Cat /etc/audit/audit.rules ---------------------------------------------- 0.42s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:266
Restore auditd.conf from integration test backup ------------------------ 0.34s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:3
Remove custom.rules if it did not exist before test --------------------- 0.33s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/cleanup.yml:31
fedora.linux_system_roles.auditd : Check if system is ostree ------------ 0.32s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:10
Check for existing custom.rules ----------------------------------------- 0.32s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:12
fedora.linux_system_roles.auditd : Assert each auditd_rules entry has required action and filter --- 0.13s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:13
fedora.linux_system_roles.auditd : Assert auditd_rules syscall requires valid filter type --- 0.12s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:64
fedora.linux_system_roles.auditd : Assert auditd_rules entries use only supported keys --- 0.09s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:29
fedora.linux_system_roles.auditd : Assert auditd_rules path or dir requires filter exit --- 0.07s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:56
fedora.linux_system_roles.auditd : Assert auditd_rules path and dir are mutually exclusive --- 0.07s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_auditd_rules.yml:49
Run the role normally --------------------------------------------------- 0.06s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/run_role_with_clear_facts.yml:33
fedora.linux_system_roles.auditd : Set platform/version specific variables --- 0.04s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/set_vars.yml:19
fedora.linux_system_roles.auditd : Validate auditd_rules structure and values --- 0.04s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/roles/auditd/tasks/assert_role_vars.yml:215
Remember whether custom.rules existed ----------------------------------- 0.03s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tasks/setup.yml:17
Set filter based on platform/version ------------------------------------ 0.03s
/tmp/collections-kxw/ansible_collections/fedora/linux_system_roles/tests/auditd/tests_auditd_integration.yml:12

-- Logs begin at Wed 2026-04-22 08:45:53 EDT, end at Wed 2026-04-22 08:48:50 EDT. --
Apr 22 08:48:43 managed-node1 sshd[7222]: Accepted publickey for root from 10.31.15.23 port 37782 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Apr 22 08:48:43 managed-node1 systemd-logind[609]: New session 6 of user root.
-- Subject: A new session 6 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 6 has been created for the user root.
--
-- The leading process of the session is 7222.
Apr 22 08:48:43 managed-node1 systemd[1]: Started Session 6 of user root.
-- Subject: Unit session-6.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-6.scope has finished starting up.
--
-- The start-up result is done.
Apr 22 08:48:43 managed-node1 sshd[7222]: pam_unix(sshd:session): session opened for user root by (uid=0)
Apr 22 08:48:43 managed-node1 sshd[7225]: Received disconnect from 10.31.15.23 port 37782:11: disconnected by user
Apr 22 08:48:43 managed-node1 sshd[7225]: Disconnected from user root 10.31.15.23 port 37782
Apr 22 08:48:43 managed-node1 sshd[7222]: pam_unix(sshd:session): session closed for user root
Apr 22 08:48:43 managed-node1 systemd[1]: session-6.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-6.scope has successfully entered the 'dead' state.
Apr 22 08:48:43 managed-node1 systemd-logind[609]: Session 6 logged out. Waiting for processes to exit.
Apr 22 08:48:43 managed-node1 systemd-logind[609]: Removed session 6.
-- Subject: Session 6 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 6 has been terminated.
Apr 22 08:48:43 managed-node1 sshd[7245]: Accepted publickey for root from 10.31.15.23 port 37786 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Apr 22 08:48:43 managed-node1 systemd[1]: Started Session 7 of user root.
-- Subject: Unit session-7.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-7.scope has finished starting up.
--
-- The start-up result is done.
Apr 22 08:48:43 managed-node1 systemd-logind[609]: New session 7 of user root.
-- Subject: A new session 7 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 7 has been created for the user root.
--
-- The leading process of the session is 7245.
Apr 22 08:48:43 managed-node1 sshd[7245]: pam_unix(sshd:session): session opened for user root by (uid=0)
Apr 22 08:48:43 managed-node1 sshd[7248]: Received disconnect from 10.31.15.23 port 37786:11: disconnected by user
Apr 22 08:48:43 managed-node1 sshd[7248]: Disconnected from user root 10.31.15.23 port 37786
Apr 22 08:48:43 managed-node1 sshd[7245]: pam_unix(sshd:session): session closed for user root
Apr 22 08:48:43 managed-node1 systemd[1]: session-7.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-7.scope has successfully entered the 'dead' state.
Apr 22 08:48:43 managed-node1 systemd-logind[609]: Session 7 logged out. Waiting for processes to exit.
Apr 22 08:48:43 managed-node1 systemd-logind[609]: Removed session 7.
-- Subject: Session 7 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 7 has been terminated.
Apr 22 08:48:43 managed-node1 sshd[7270]: Accepted publickey for root from 10.31.15.23 port 37796 ssh2: ECDSA SHA256:spwTwN/jGN4Hz6RX6sRARY55UcuqH9JP+Hz3/+veAdI
Apr 22 08:48:43 managed-node1 systemd-logind[609]: New session 8 of user root.
-- Subject: A new session 8 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 8 has been created for the user root.
--
-- The leading process of the session is 7270.
Apr 22 08:48:43 managed-node1 systemd[1]: Started Session 8 of user root.
-- Subject: Unit session-8.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-8.scope has finished starting up.
--
-- The start-up result is done.
Apr 22 08:48:43 managed-node1 sshd[7270]: pam_unix(sshd:session): session opened for user root by (uid=0)
Apr 22 08:48:44 managed-node1 platform-python[7415]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d
Apr 22 08:48:45 managed-node1 platform-python[7563]: ansible-ansible.builtin.stat Invoked with path=/etc/audit/auditd.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Apr 22 08:48:45 managed-node1 platform-python[7688]: ansible-ansible.builtin.stat Invoked with path=/etc/audit/rules.d/custom.rules follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Apr 22 08:48:46 managed-node1 platform-python[7811]: ansible-copy Invoked with src=/etc/audit/auditd.conf dest=/root/.lsr_auditd_integration_auditd.conf.bak remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None regexp=None delimiter=None
Apr 22 08:48:47 managed-node1 platform-python[7975]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d
Apr 22 08:48:47 managed-node1 platform-python[8102]: ansible-ansible.builtin.stat Invoked with path=/run/ostree-booted follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Apr 22 08:48:48 managed-node1 platform-python[8225]: ansible-ansible.builtin.command Invoked with _raw_params=cat /etc/audit/audit.rules warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Apr 22 08:48:49 managed-node1 platform-python[8349]: ansible-copy Invoked with src=/root/.lsr_auditd_integration_auditd.conf.bak dest=/etc/audit/auditd.conf remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None regexp=None delimiter=None
Apr 22 08:48:49 managed-node1 platform-python[8472]: ansible-ansible.builtin.file Invoked with path=/root/.lsr_auditd_integration_auditd.conf.bak state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Apr 22 08:48:49 managed-node1 platform-python[8595]: ansible-ansible.builtin.file Invoked with path=/etc/audit/rules.d/custom.rules state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None directory_mode=None
Apr 22 08:48:50 managed-node1 sshd[8616]: Accepted publickey for root from 10.31.15.23 port 50354 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Apr 22 08:48:50 managed-node1 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Apr 22 08:48:50 managed-node1 systemd-logind[609]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 8616.
Apr 22 08:48:50 managed-node1 sshd[8616]: pam_unix(sshd:session): session opened for user root by (uid=0)
Apr 22 08:48:50 managed-node1 sshd[8619]: Received disconnect from 10.31.15.23 port 50354:11: disconnected by user
Apr 22 08:48:50 managed-node1 sshd[8619]: Disconnected from user root 10.31.15.23 port 50354
Apr 22 08:48:50 managed-node1 sshd[8616]: pam_unix(sshd:session): session closed for user root
Apr 22 08:48:50 managed-node1 systemd[1]: session-9.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-9.scope has successfully entered the 'dead' state.
Apr 22 08:48:50 managed-node1 systemd-logind[609]: Session 9 logged out. Waiting for processes to exit.
Apr 22 08:48:50 managed-node1 systemd-logind[609]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Apr 22 08:48:50 managed-node1 sshd[8640]: Accepted publickey for root from 10.31.15.23 port 50358 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Apr 22 08:48:50 managed-node1 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Apr 22 08:48:50 managed-node1 systemd-logind[609]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 8640.
Apr 22 08:48:50 managed-node1 sshd[8640]: pam_unix(sshd:session): session opened for user root by (uid=0)
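Editor's note: the single failure in this run is "No test named 'auditd_non_empty_str_or_list' found" — ansible-core 2.9 could not resolve a custom Jinja2 test that the role's assertion references. In an Ansible collection, such a test is normally shipped as a test plugin exposing a `TestModule.tests()` mapping. The sketch below is hypothetical (the file name, location, and exact semantics are assumptions, not the role's actual code) and only illustrates what a test with that name and the behavior implied by the task name would look like:

```python
# Hypothetical sketch of the missing test plugin; in a collection it might
# live at plugins/test/auditd_non_empty_str_or_list.py. The name and the
# exact validation rules are assumptions based on the task name
# "Assert auditd_rules field is a non-empty string or list of non-empty strings".


def auditd_non_empty_str_or_list(value):
    """Return True if value is a non-empty string, or a non-empty list
    whose elements are all non-empty strings."""
    if isinstance(value, str):
        return len(value.strip()) > 0
    if isinstance(value, list):
        return len(value) > 0 and all(
            isinstance(v, str) and len(v.strip()) > 0 for v in value
        )
    return False


class TestModule:
    """Ansible test plugin entry point: maps Jinja2 test names to callables,
    so `item.field is auditd_non_empty_str_or_list` can resolve."""

    def tests(self):
        return {"auditd_non_empty_str_or_list": auditd_non_empty_str_or_list}
```

With a plugin like this on the controller's test-plugin path, the conditional would evaluate (for example, the `field` list `['success=1', 'auid!=-1', ...]` used in this run would pass) instead of aborting the play with "No test named ... found".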