[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
27844 1726882740.25392: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
27844 1726882740.25835: Added group all to inventory 27844 1726882740.25837: Added group ungrouped to inventory 27844 1726882740.25841: Group all now contains ungrouped 27844 1726882740.25845: Examining possible inventory source: /tmp/network-91m/inventory.yml 27844 1726882740.48536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 27844 1726882740.48603: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 27844 1726882740.48626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 27844 1726882740.48687: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 27844 1726882740.48760: Loaded config def from plugin (inventory/script) 27844 1726882740.48762: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 27844 1726882740.48805: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 27844 1726882740.48893: Loaded config def from plugin
(inventory/yaml) 27844 1726882740.48895: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 27844 1726882740.48978: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 27844 1726882740.49593: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 27844 1726882740.49597: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 27844 1726882740.49600: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 27844 1726882740.49607: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 27844 1726882740.49612: Loading data from /tmp/network-91m/inventory.yml 27844 1726882740.49816: /tmp/network-91m/inventory.yml was not parsable by auto 27844 1726882740.50152: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 27844 1726882740.50199: Loading data from /tmp/network-91m/inventory.yml 27844 1726882740.50292: group all already in inventory 27844 1726882740.50300: set inventory_file for managed_node1 27844 1726882740.50304: set inventory_dir for managed_node1 27844 1726882740.50305: Added host managed_node1 to inventory 27844 1726882740.50308: Added host managed_node1 to group all 27844 1726882740.50313: set ansible_host for managed_node1 27844 1726882740.50314: set ansible_ssh_extra_args for managed_node1 27844 1726882740.50318: set inventory_file for managed_node2 27844 1726882740.50321: set inventory_dir for managed_node2 27844 1726882740.50322: Added host managed_node2 to inventory 27844 1726882740.50323: Added host managed_node2 to group all 27844 1726882740.50324: set ansible_host for managed_node2 27844 1726882740.50325: set ansible_ssh_extra_args for managed_node2 27844 
1726882740.50328: set inventory_file for managed_node3 27844 1726882740.50330: set inventory_dir for managed_node3 27844 1726882740.50331: Added host managed_node3 to inventory 27844 1726882740.50332: Added host managed_node3 to group all 27844 1726882740.50333: set ansible_host for managed_node3 27844 1726882740.50334: set ansible_ssh_extra_args for managed_node3 27844 1726882740.50337: Reconcile groups and hosts in inventory. 27844 1726882740.50341: Group ungrouped now contains managed_node1 27844 1726882740.50343: Group ungrouped now contains managed_node2 27844 1726882740.50345: Group ungrouped now contains managed_node3 27844 1726882740.50721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 27844 1726882740.50887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 27844 1726882740.51122: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 27844 1726882740.51155: Loaded config def from plugin (vars/host_group_vars) 27844 1726882740.51158: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 27844 1726882740.51167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 27844 1726882740.51175: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 27844 1726882740.51220: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 27844 1726882740.51641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882740.51742: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 27844 1726882740.51786: Loaded config def from plugin (connection/local) 27844 1726882740.51789: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 27844 1726882740.52723: Loaded config def from plugin (connection/paramiko_ssh) 27844 1726882740.52726: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 27844 1726882740.54100: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27844 1726882740.54145: Loaded config def from plugin (connection/psrp) 27844 1726882740.54148: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 27844 1726882740.55590: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27844 1726882740.55733: Loaded config def from plugin (connection/ssh) 27844 1726882740.55737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 27844 1726882740.60023: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27844 1726882740.60112: Loaded config def from plugin (connection/winrm) 27844 1726882740.60116: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 27844 1726882740.60154: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 27844 1726882740.60255: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 27844 1726882740.60328: Loaded config def from plugin (shell/cmd) 27844 1726882740.60330: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 27844 1726882740.60361: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 27844 1726882740.60471: Loaded config def from plugin (shell/powershell) 27844 1726882740.60474: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 27844 1726882740.60597: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 27844 1726882740.60867: Loaded config def from plugin (shell/sh) 27844 1726882740.60870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 27844 1726882740.60976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 27844 1726882740.61492: Loaded config def from plugin (become/runas) 27844 1726882740.61525: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 27844 1726882740.61887: Loaded config def from plugin (become/su) 27844 1726882740.61889: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 27844 1726882740.62125: Loaded config def from plugin (become/sudo) 27844 
1726882740.62127: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 27844 1726882740.62166: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml 27844 1726882740.62624: in VariableManager get_vars() 27844 1726882740.62645: done with get_vars() 27844 1726882740.62798: trying /usr/local/lib/python3.12/site-packages/ansible/modules 27844 1726882740.67674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 27844 1726882740.67802: in VariableManager get_vars() 27844 1726882740.67807: done with get_vars() 27844 1726882740.67809: variable 'playbook_dir' from source: magic vars 27844 1726882740.67810: variable 'ansible_playbook_python' from source: magic vars 27844 1726882740.67811: variable 'ansible_config_file' from source: magic vars 27844 1726882740.67812: variable 'groups' from source: magic vars 27844 1726882740.67813: variable 'omit' from source: magic vars 27844 1726882740.67813: variable 'ansible_version' from source: magic vars 27844 1726882740.67814: variable 'ansible_check_mode' from source: magic vars 27844 1726882740.67815: variable 'ansible_diff_mode' from source: magic vars 27844 1726882740.67816: variable 'ansible_forks' from source: magic vars 27844 1726882740.67816: variable 'ansible_inventory_sources' from source: magic vars 27844 1726882740.67817: variable 'ansible_skip_tags' from source: magic vars 27844 1726882740.67818: variable 'ansible_limit' from source: magic vars 27844 1726882740.67819: variable 'ansible_run_tags' from source: magic vars 27844 1726882740.67819: variable 'ansible_verbosity' from source: magic vars 27844 1726882740.67863: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml 27844 1726882740.68934: in 
VariableManager get_vars() 27844 1726882740.68956: done with get_vars() 27844 1726882740.68998: in VariableManager get_vars() 27844 1726882740.69011: done with get_vars() 27844 1726882740.69045: in VariableManager get_vars() 27844 1726882740.69056: done with get_vars() 27844 1726882740.69177: in VariableManager get_vars() 27844 1726882740.69190: done with get_vars() 27844 1726882740.69227: in VariableManager get_vars() 27844 1726882740.69238: done with get_vars() 27844 1726882740.69293: in VariableManager get_vars() 27844 1726882740.69306: done with get_vars() 27844 1726882740.69359: in VariableManager get_vars() 27844 1726882740.69483: done with get_vars() 27844 1726882740.69488: variable 'omit' from source: magic vars 27844 1726882740.69513: variable 'omit' from source: magic vars 27844 1726882740.69546: in VariableManager get_vars() 27844 1726882740.69556: done with get_vars() 27844 1726882740.69606: in VariableManager get_vars() 27844 1726882740.69619: done with get_vars() 27844 1726882740.69651: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27844 1726882740.69941: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27844 1726882740.70081: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27844 1726882740.71931: in VariableManager get_vars() 27844 1726882740.71946: done with get_vars() 27844 1726882740.73825: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 27844 1726882740.74124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27844 1726882740.77349: in VariableManager get_vars() 27844 1726882740.77372: done with get_vars() 27844 1726882740.77378: variable 'omit' from source: magic vars 27844 1726882740.77394: variable 'omit' from source: 
magic vars 27844 1726882740.77428: in VariableManager get_vars() 27844 1726882740.77442: done with get_vars() 27844 1726882740.77470: in VariableManager get_vars() 27844 1726882740.77487: done with get_vars() 27844 1726882740.77521: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27844 1726882740.79355: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27844 1726882740.79432: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27844 1726882740.80256: in VariableManager get_vars() 27844 1726882740.80280: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27844 1726882740.85156: in VariableManager get_vars() 27844 1726882740.85184: done with get_vars() 27844 1726882740.85259: in VariableManager get_vars() 27844 1726882740.85281: done with get_vars() 27844 1726882740.85396: in VariableManager get_vars() 27844 1726882740.85415: done with get_vars() 27844 1726882740.85461: in VariableManager get_vars() 27844 1726882740.88433: done with get_vars() 27844 1726882740.88479: in VariableManager get_vars() 27844 1726882740.88507: done with get_vars() 27844 1726882740.88546: in VariableManager get_vars() 27844 1726882740.88567: done with get_vars() 27844 1726882740.88635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 27844 1726882740.88650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 27844 1726882740.88901: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 27844 1726882740.89128: Loaded config def from plugin 
(callback/ansible_collections.ansible.posix.plugins.callback.debug) 27844 1726882740.89131: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 27844 1726882740.89170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 27844 1726882740.89196: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 27844 1726882740.89701: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 27844 1726882740.89772: Loaded config def from plugin (callback/default) 27844 1726882740.89775: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 27844 1726882740.91140: Loaded config def from plugin (callback/junit) 27844 1726882740.91143: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 27844 1726882740.91213: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 27844 1726882740.92172: Loaded config 
def from plugin (callback/minimal) 27844 1726882740.92175: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 27844 1726882740.92219: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 27844 1726882740.92292: Loaded config def from plugin (callback/tree) 27844 1726882740.92294: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 27844 1726882740.92438: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 27844 1726882740.92440: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_route_device_nm.yml ********************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
27844 1726882740.92497: in VariableManager get_vars() 27844 1726882740.92513: done with get_vars() 27844 1726882740.92519: in VariableManager get_vars() 27844 1726882740.92528: done with get_vars() 27844 1726882740.92532: variable 'omit' from source: magic vars 27844 1726882740.92573: in VariableManager get_vars() 27844 1726882740.92589: done with get_vars() 27844 1726882740.92643: variable 'omit' from source: magic vars
PLAY [Run playbook 'playbooks/tests_route_device.yml' with nm as provider] *****
27844 1726882740.93234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 27844 1726882740.93424: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 27844 1726882740.94294: getting the remaining hosts for this loop 27844 1726882740.94296: done getting the remaining hosts for this loop 27844 1726882740.94299: getting the next task for host managed_node1 27844 1726882740.94303: done getting next task for host managed_node1 27844 1726882740.94305: ^ task is: TASK: Gathering Facts 27844 1726882740.94307: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 27844 1726882740.94310: getting variables 27844 1726882740.94311: in VariableManager get_vars() 27844 1726882740.94321: Calling all_inventory to load vars for managed_node1 27844 1726882740.94324: Calling groups_inventory to load vars for managed_node1 27844 1726882740.94326: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882740.94339: Calling all_plugins_play to load vars for managed_node1 27844 1726882740.94351: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882740.94354: Calling groups_plugins_play to load vars for managed_node1 27844 1726882740.94494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882740.94548: done with get_vars() 27844 1726882740.94554: done getting variables 27844 1726882740.94735: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Friday 20 September 2024 21:39:00 -0400 (0:00:00.024) 0:00:00.024 ******
27844 1726882740.94756: entering _queue_task() for managed_node1/gather_facts 27844 1726882740.94758: Creating lock for gather_facts 27844 1726882740.95541: worker is 1 (out of 1 available) 27844 1726882740.95553: exiting _queue_task() for managed_node1/gather_facts 27844 1726882740.95687: done queuing things up, now waiting for results queue to drain 27844 1726882740.95709: waiting for pending results...
27844 1726882740.96307: running TaskExecutor() for managed_node1/TASK: Gathering Facts 27844 1726882740.96502: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000bf 27844 1726882740.96618: variable 'ansible_search_path' from source: unknown 27844 1726882740.96694: calling self._execute() 27844 1726882740.96836: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882740.96884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882740.96925: variable 'omit' from source: magic vars 27844 1726882740.97144: variable 'omit' from source: magic vars 27844 1726882740.97201: variable 'omit' from source: magic vars 27844 1726882740.97251: variable 'omit' from source: magic vars 27844 1726882740.97345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882740.97444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882740.97535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882740.97554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882740.97591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882740.97752: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882740.97761: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882740.97774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882740.97919: Set connection var ansible_shell_type to sh 27844 1726882740.97945: Set connection var ansible_connection to ssh 27844 1726882740.97958: Set connection var ansible_pipelining to False 27844 1726882740.97978: Set connection var ansible_timeout to 10 27844 
1726882740.98074: Set connection var ansible_shell_executable to /bin/sh 27844 1726882740.98085: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882740.98115: variable 'ansible_shell_executable' from source: unknown 27844 1726882740.98123: variable 'ansible_connection' from source: unknown 27844 1726882740.98130: variable 'ansible_module_compression' from source: unknown 27844 1726882740.98137: variable 'ansible_shell_type' from source: unknown 27844 1726882740.98143: variable 'ansible_shell_executable' from source: unknown 27844 1726882740.98150: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882740.98158: variable 'ansible_pipelining' from source: unknown 27844 1726882740.98176: variable 'ansible_timeout' from source: unknown 27844 1726882740.98222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882740.98592: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882740.98727: variable 'omit' from source: magic vars 27844 1726882740.98736: starting attempt loop 27844 1726882740.98741: running the handler 27844 1726882740.98759: variable 'ansible_facts' from source: unknown 27844 1726882740.98791: _low_level_execute_command(): starting 27844 1726882740.98834: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882741.01600: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882741.01618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.01634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.01653: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.01707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.02057: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882741.02077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.02097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882741.02109: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882741.02121: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882741.02134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.02147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.02162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.02282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.02295: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882741.02309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.02382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882741.02405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882741.02422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882741.02551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882741.04221: stdout chunk (state=3): >>>/root <<< 27844 1726882741.04412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882741.04415: stdout 
chunk (state=3): >>><<< 27844 1726882741.04417: stderr chunk (state=3): >>><<< 27844 1726882741.04471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882741.04475: _low_level_execute_command(): starting 27844 1726882741.04478: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435 `" && echo ansible-tmp-1726882741.0443697-27891-220359103952435="` echo /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435 `" ) && sleep 0' 27844 1726882741.06214: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882741.06235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.06249: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.06272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.06314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.06327: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882741.06342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.06389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882741.06402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882741.06414: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882741.06426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.06439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.06453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.06469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.06481: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882741.06493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.06570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882741.06853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882741.06877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882741.07008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882741.08893: stdout chunk (state=3): 
>>>ansible-tmp-1726882741.0443697-27891-220359103952435=/root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435 <<< 27844 1726882741.09089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882741.09092: stdout chunk (state=3): >>><<< 27844 1726882741.09095: stderr chunk (state=3): >>><<< 27844 1726882741.09374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882741.0443697-27891-220359103952435=/root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882741.09377: variable 'ansible_module_compression' from source: unknown 27844 1726882741.09380: ANSIBALLZ: Using generic lock for ansible.legacy.setup 27844 1726882741.09382: ANSIBALLZ: Acquiring lock 27844 1726882741.09384: ANSIBALLZ: Lock acquired: 139916607833536 27844 1726882741.09386: ANSIBALLZ: Creating 
module 27844 1726882741.46728: ANSIBALLZ: Writing module into payload 27844 1726882741.46897: ANSIBALLZ: Writing module 27844 1726882741.46931: ANSIBALLZ: Renaming module 27844 1726882741.46943: ANSIBALLZ: Done creating module 27844 1726882741.46984: variable 'ansible_facts' from source: unknown 27844 1726882741.46997: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882741.47011: _low_level_execute_command(): starting 27844 1726882741.47022: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 27844 1726882741.47665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882741.48180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.48197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.48213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.48253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.48271: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882741.48287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.48305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882741.48317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882741.48327: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882741.48338: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.48350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.48369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.48382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.48392: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882741.48405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.48484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882741.48506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882741.48522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882741.48657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882741.50324: stdout chunk (state=3): >>>PLATFORM <<< 27844 1726882741.50421: stdout chunk (state=3): >>>Linux FOUND <<< 27844 1726882741.50436: stdout chunk (state=3): >>>/usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 27844 1726882741.50648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882741.50651: stdout chunk (state=3): >>><<< 27844 1726882741.50653: stderr chunk (state=3): >>><<< 27844 1726882741.50788: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882741.50798 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 27844 1726882741.50802: _low_level_execute_command(): starting 27844 1726882741.50804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 27844 1726882741.51244: Sending initial data 27844 1726882741.51248: Sent initial data (1181 bytes) 27844 1726882741.52540: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.52543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.53350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882741.53353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.53356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.53652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882741.53756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882741.53968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882741.57699: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 27844 1726882741.58062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882741.58170: stderr chunk (state=3): >>><<< 27844 1726882741.58173: stdout chunk (state=3): >>><<< 27844 1726882741.58176: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882741.58272: variable 'ansible_facts' from source: unknown 27844 1726882741.58276: variable 'ansible_facts' from source: unknown 27844 1726882741.58279: variable 'ansible_module_compression' from source: unknown 27844 1726882741.58472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 27844 1726882741.58476: variable 'ansible_facts' from source: unknown 27844 1726882741.58498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435/AnsiballZ_setup.py 27844 1726882741.58649: Sending initial data 27844 1726882741.58652: Sent initial data (154 bytes) 27844 1726882741.59547: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 
1726882741.59560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.59579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.59596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.59649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.59662: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882741.59680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.59698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882741.59710: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882741.59737: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882741.59753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.59773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.59789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.59802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.59812: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882741.59836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.59915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882741.59966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882741.59983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882741.60107: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882741.61902: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882741.61995: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882741.62096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpg46c8wx9 /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435/AnsiballZ_setup.py <<< 27844 1726882741.62190: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882741.65499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882741.65624: stderr chunk (state=3): >>><<< 27844 1726882741.65628: stdout chunk (state=3): >>><<< 27844 1726882741.65631: done transferring module to remote 27844 1726882741.65634: _low_level_execute_command(): starting 27844 1726882741.65636: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435/ /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435/AnsiballZ_setup.py && sleep 0' 27844 1726882741.66260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882741.66282: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 27844 1726882741.66304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.66324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.66368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.66382: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882741.66407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.66426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882741.66438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882741.66450: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882741.66462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.66480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.66496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.66517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.66531: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882741.66546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.66632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882741.66649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882741.66668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882741.66798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 27844 1726882741.68659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882741.68663: stdout chunk (state=3): >>><<< 27844 1726882741.68669: stderr chunk (state=3): >>><<< 27844 1726882741.68752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882741.68756: _low_level_execute_command(): starting 27844 1726882741.68758: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435/AnsiballZ_setup.py && sleep 0' 27844 1726882741.69313: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882741.69329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.69344: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 27844 1726882741.69361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.69405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.69418: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882741.69436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.69454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882741.69470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882741.69570: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882741.69585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882741.69600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882741.69616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882741.69629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882741.69641: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882741.69654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882741.69734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882741.69754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882741.69775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882741.69911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882741.71879: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # 
builtin <<< 27844 1726882741.71883: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 27844 1726882741.71938: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 27844 1726882741.71980: stdout chunk (state=3): >>>import 'posix' # <<< 27844 1726882741.72012: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 27844 1726882741.72015: stdout chunk (state=3): >>># installing zipimport hook <<< 27844 1726882741.72058: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 27844 1726882741.72108: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.72150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 27844 1726882741.72153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 27844 1726882741.72184: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428fb3dc0> <<< 27844 1726882741.72213: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 27844 1726882741.72231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428fb3b20> <<< 27844 1726882741.72270: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 27844 1726882741.72308: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428fb3ac0> <<< 27844 1726882741.72326: stdout chunk (state=3): >>>import '_signal' # <<< 27844 1726882741.72348: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 27844 1726882741.72373: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58490> <<< 27844 1726882741.72399: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 27844 1726882741.72420: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58940> <<< 27844 1726882741.72434: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58670> <<< 27844 1726882741.72473: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 27844 1726882741.72477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 27844 1726882741.72496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 27844 1726882741.72531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 27844 1726882741.72535: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 27844 1726882741.72574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 27844 1726882741.72577: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f0f190> <<< 27844 1726882741.72596: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 27844 1726882741.72609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 27844 1726882741.72696: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f0f220> <<< 27844 1726882741.72711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 27844 1726882741.72746: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f32850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f0f940> <<< 27844 1726882741.72790: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f70880> <<< 27844 1726882741.72810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f08d90> <<< 
27844 1726882741.72866: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f32d90> <<< 27844 1726882741.72920: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58970> <<< 27844 1726882741.72946: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 27844 1726882741.73286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 27844 1726882741.73289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 27844 1726882741.73324: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 27844 1726882741.73342: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 27844 1726882741.73362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 27844 1726882741.73398: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 27844 1726882741.73401: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eaeeb0> <<< 27844 1726882741.73445: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eb1f40> <<< 
27844 1726882741.73477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 27844 1726882741.73494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 27844 1726882741.73511: stdout chunk (state=3): >>>import '_sre' # <<< 27844 1726882741.73544: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 27844 1726882741.73555: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 27844 1726882741.73589: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ea7610> <<< 27844 1726882741.73606: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ead640> <<< 27844 1726882741.73622: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eae370> <<< 27844 1726882741.73625: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 27844 1726882741.73682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 27844 1726882741.73707: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 27844 1726882741.73729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.73757: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 27844 1726882741.73803: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428e30d90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30880> <<< 27844 1726882741.73817: stdout chunk (state=3): >>>import 'itertools' # <<< 27844 1726882741.73842: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30e80> <<< 27844 1726882741.73845: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 27844 1726882741.73879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 27844 1726882741.73911: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30f40> <<< 27844 1726882741.73934: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 27844 1726882741.73937: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30e50> import '_collections' # <<< 27844 1726882741.74001: stdout chunk (state=3): >>>import 'collections' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8428e89d00> <<< 27844 1726882741.74006: stdout chunk (state=3): >>>import '_functools' # <<< 27844 1726882741.74016: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e825e0> <<< 27844 1726882741.74085: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e96640> <<< 27844 1726882741.74090: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eb5df0> <<< 27844 1726882741.74117: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 27844 1726882741.74131: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428e42c40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e89220> <<< 27844 1726882741.74188: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882741.74200: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428e96250> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ebb9a0> <<< 27844 
1726882741.74222: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 27844 1726882741.74236: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.74277: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 27844 1726882741.74280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 27844 1726882741.74304: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42f70> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42d60> <<< 27844 1726882741.74332: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42cd0> <<< 27844 1726882741.74351: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 27844 1726882741.74371: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 27844 1726882741.74392: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches 
/usr/lib64/python3.9/typing.py <<< 27844 1726882741.74440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 27844 1726882741.74472: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428b73340> <<< 27844 1726882741.74491: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 27844 1726882741.74505: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 27844 1726882741.74531: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428b73430> <<< 27844 1726882741.74657: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e4af70> <<< 27844 1726882741.74695: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e44a00> <<< 27844 1726882741.74714: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e444c0> <<< 27844 1726882741.74733: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 27844 1726882741.74770: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 27844 1726882741.74792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 27844 1726882741.74814: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428aa6190> <<< 27844 1726882741.74844: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428b5dcd0> <<< 27844 1726882741.74921: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e44e80> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eb5fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 27844 1726882741.74943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 27844 1726882741.74986: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ab8ac0> <<< 27844 1726882741.74992: stdout chunk (state=3): >>>import 'errno' # <<< 27844 1726882741.75018: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428ab8df0> <<< 27844 1726882741.75057: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 27844 1726882741.75084: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches 
/usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 27844 1726882741.75094: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428aca700> <<< 27844 1726882741.75110: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 27844 1726882741.75133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 27844 1726882741.75162: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428acac40> <<< 27844 1726882741.75221: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a58370> <<< 27844 1726882741.75232: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ab8ee0> <<< 27844 1726882741.75246: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 27844 1726882741.75299: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a68250> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428aca580> <<< 27844 1726882741.75303: stdout chunk (state=3): >>>import 'pwd' # <<< 27844 1726882741.75327: stdout 
chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a68310> <<< 27844 1726882741.75376: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e429a0> <<< 27844 1726882741.75395: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 27844 1726882741.75427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 27844 1726882741.75431: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 27844 1726882741.75443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 27844 1726882741.75481: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a84670> <<< 27844 1726882741.75495: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 27844 1726882741.75523: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f8428a84940> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a84730> <<< 27844 1726882741.75554: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a84820> <<< 27844 1726882741.75589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 27844 1726882741.75789: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a84c70> <<< 27844 1726882741.75826: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882741.75833: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a921c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a848b0> <<< 27844 1726882741.75849: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a77a00> <<< 27844 1726882741.75868: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42580> <<< 27844 1726882741.75891: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 27844 1726882741.75945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 27844 1726882741.75998: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a84a60> <<< 27844 1726882741.76144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 27844 1726882741.76147: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f84289ad640> <<< 27844 1726882741.76386: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip' <<< 27844 1726882741.76390: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.76475: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.76521: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 27844 1726882741.76548: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 27844 1726882741.76551: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.77757: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.78688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 27844 1726882741.78714: stdout chunk (state=3): >>>import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f84283047c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.78759: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 27844 1726882741.78774: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 27844 1726882741.78788: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428395790> <<< 27844 1726882741.78828: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428395670> <<< 27844 1726882741.78863: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283953d0> <<< 27844 1726882741.78880: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 27844 1726882741.78924: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283954c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283951c0> import 'atexit' # <<< 27844 1726882741.78957: stdout chunk (state=3): >>># extension module 
'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428395430> <<< 27844 1726882741.78975: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 27844 1726882741.78997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 27844 1726882741.79042: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283957f0> <<< 27844 1726882741.79077: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 27844 1726882741.79082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 27844 1726882741.79120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 27844 1726882741.79140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 27844 1726882741.79143: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 27844 1726882741.79223: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842836e7c0> <<< 27844 1726882741.79266: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' 
import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f842836eb50> <<< 27844 1726882741.79291: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882741.79315: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f842836e9a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 27844 1726882741.79353: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282874f0> <<< 27844 1726882741.79370: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842838ed30> <<< 27844 1726882741.79561: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428395550> <<< 27844 1726882741.79573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 27844 1726882741.79599: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842838e160> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 27844 1726882741.79639: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 27844 1726882741.79677: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 27844 1726882741.79700: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283bfa30> <<< 27844 1726882741.79803: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283621c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428362790> <<< 27844 1726882741.79807: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842828dd00> <<< 27844 1726882741.79838: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283626a0> <<< 27844 1726882741.79857: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283e3d00> <<< 27844 1726882741.79881: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 27844 1726882741.79896: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 27844 1726882741.79926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 27844 1726882741.80018: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282e59a0> <<< 27844 1726882741.80040: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283eedf0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 27844 1726882741.80043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 27844 1726882741.80100: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882741.80107: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282f50d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283eee20> <<< 27844 1726882741.80118: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 27844 1726882741.80142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.80173: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from 
'/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 27844 1726882741.80229: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283f5220> <<< 27844 1726882741.80365: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282f5100> <<< 27844 1726882741.80463: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283b9b50> <<< 27844 1726882741.80489: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283eeac0> <<< 27844 1726882741.80529: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283eecd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283e3cd0> <<< 27844 1726882741.80553: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 27844 1726882741.80584: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 27844 1726882741.80593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 27844 1726882741.80643: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282f10d0> <<< 27844 1726882741.80831: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282e7370> <<< 27844 1726882741.80842: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282f1d00> <<< 27844 1726882741.80885: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282f16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282f2130> # zipimport: zlib available <<< 27844 1726882741.80917: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 27844 1726882741.80998: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.81080: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.81117: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 27844 1726882741.81130: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 27844 1726882741.81238: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.81334: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.81810: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.82285: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 27844 1726882741.82293: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 27844 1726882741.82319: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 27844 1726882741.82332: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.82393: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882741.82404: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283028b0> <<< 27844 1726882741.82461: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 27844 1726882741.82482: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842832f8e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e946d0> <<< 27844 1726882741.82528: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 27844 1726882741.82560: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27844 1726882741.82580: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 27844 1726882741.82701: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.82840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 27844 1726882741.82865: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f842836c7c0> # zipimport: zlib available <<< 27844 1726882741.83271: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.83646: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.83697: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.83765: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 27844 1726882741.83803: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.83842: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 27844 1726882741.83845: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.83896: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.83992: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 27844 1726882741.84006: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 27844 1726882741.84041: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84083: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 27844 1726882741.84089: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84270: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84455: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 27844 1726882741.84499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 27844 1726882741.84502: stdout chunk (state=3): >>>import '_ast' # <<< 27844 1726882741.84574: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e98d60> <<< 27844 1726882741.84577: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84631: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84718: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 27844 1726882741.84722: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 27844 1726882741.84734: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84762: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 27844 1726882741.84805: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 27844 1726882741.84839: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84883: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.84983: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.85048: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 27844 1726882741.85062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.85145: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428320070> <<< 27844 1726882741.85256: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e60f10> <<< 27844 1726882741.85298: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 27844 1726882741.85305: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available 
<<< 27844 1726882741.85348: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.85408: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.85431: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.85474: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 27844 1726882741.85496: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 27844 1726882741.85499: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 27844 1726882741.85537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 27844 1726882741.85550: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 27844 1726882741.85576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 27844 1726882741.85653: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428329190> <<< 27844 1726882741.85703: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428325d60> <<< 27844 1726882741.85756: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e98b80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 27844 1726882741.85769: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.85792: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.85818: stdout 
chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 27844 1726882741.85893: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 27844 1726882741.85927: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27844 1726882741.85946: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 27844 1726882741.85986: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86041: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86078: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27844 1726882741.86114: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86151: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86182: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86217: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 27844 1726882741.86229: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86285: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86362: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 27844 1726882741.86391: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86411: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 27844 1726882741.86554: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86700: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86733: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.86779: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882741.86809: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 27844 1726882741.86842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 27844 1726882741.86870: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427c13730> <<< 27844 1726882741.86894: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 27844 1726882741.86922: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 27844 1726882741.86950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 27844 1726882741.86982: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 27844 1726882741.86994: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e59bb0> <<< 27844 1726882741.87030: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427e59c40> <<< 27844 1726882741.87089: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e4a1f0> <<< 27844 1726882741.87109: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427eba940> <<< 27844 1726882741.87138: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8220> <<< 27844 1726882741.87162: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8310> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 27844 1726882741.87188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 27844 1726882741.87207: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 27844 1726882741.87237: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427ea4ca0> <<< 27844 1726882741.87276: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e75d00> <<< 27844 1726882741.87288: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 27844 1726882741.87319: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea4340> <<< 27844 1726882741.87331: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 27844 1726882741.87350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 27844 1726882741.87407: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427c7bf40> <<< 27844 1726882741.87434: stdout chunk (state=3): >>>import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea4fd0> <<< 27844 1726882741.87545: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available <<< 27844 1726882741.87558: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 27844 1726882741.87599: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 27844 1726882741.87615: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.87641: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.87705: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 27844 1726882741.87727: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 27844 
1726882741.87749: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.87784: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 27844 1726882741.87837: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.87886: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 27844 1726882741.87889: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.87920: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.87965: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 27844 1726882741.88019: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.88070: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.88131: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.88182: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 27844 1726882741.88185: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.88575: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 27844 1726882741.88942: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 27844 1726882741.88989: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89035: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89074: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89109: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 27844 1726882741.89119: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89145: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89177: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 27844 1726882741.89180: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89214: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89267: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 27844 1726882741.89304: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89341: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 27844 1726882741.89345: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89370: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89388: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 27844 1726882741.89455: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89527: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 27844 1726882741.89551: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427bd0f40> <<< 27844 1726882741.89585: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 27844 1726882741.89598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 27844 1726882741.89752: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427bd0b80> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 27844 1726882741.89769: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89817: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89882: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 27844 1726882741.89885: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.89953: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.90037: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 27844 1726882741.90089: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.90170: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 27844 1726882741.90173: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.90201: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.90244: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 27844 1726882741.90271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 27844 1726882741.90418: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427bcfd00> <<< 27844 1726882741.90676: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8580> import ansible.module_utils.facts.system.python # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 27844 1726882741.90679: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.90715: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.90773: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 27844 1726882741.90838: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.90917: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91003: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91146: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 27844 1726882741.91158: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91181: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91221: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 27844 1726882741.91233: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91258: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91398: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427b00520> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427b00a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 27844 1726882741.91414: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91439: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91486: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 27844 1726882741.91625: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91742: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 27844 1726882741.91753: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91831: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 27844 1726882741.91914: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.91945: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.92006: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 27844 1726882741.92011: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 27844 1726882741.92091: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.92117: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.92226: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.92357: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 27844 1726882741.92361: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.92459: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.92575: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 27844 1726882741.92597: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.92631: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 27844 1726882741.93087: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.93498: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 27844 1726882741.93515: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.93593: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.93694: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 27844 1726882741.93697: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.93773: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.93860: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 27844 1726882741.93990: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94122: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 27844 1726882741.94148: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 27844 1726882741.94165: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94196: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94232: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 27844 1726882741.94244: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94325: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94409: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94581: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94768: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 27844 1726882741.94778: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94791: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94848: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 27844 1726882741.94851: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94887: stdout chunk (state=3): >>># zipimport: zlib available import 
ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 27844 1726882741.94891: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.94942: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95021: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 27844 1726882741.95025: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95052: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 27844 1726882741.95068: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95110: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95169: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 27844 1726882741.95183: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95219: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95276: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 27844 1726882741.95531: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95706: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 27844 1726882741.95757: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95816: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 27844 1726882741.95819: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95844: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95890: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 27844 1726882741.95893: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95913: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95941: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 27844 1726882741.95954: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.95978: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96016: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 27844 1726882741.96019: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96085: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 27844 1726882741.96170: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 27844 1726882741.96199: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 27844 1726882741.96202: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96233: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96276: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 27844 1726882741.96297: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27844 1726882741.96321: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96360: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96399: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96457: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96529: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 27844 1726882741.96547: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 27844 1726882741.96586: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96635: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 27844 1726882741.96638: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96798: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.96961: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 27844 1726882741.96967: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97001: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97051: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 27844 1726882741.97055: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97089: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97136: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 27844 1726882741.97207: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97286: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 27844 1726882741.97289: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97358: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97443: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 27844 1726882741.97514: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882741.97694: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 27844 1726882741.97720: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 27844 1726882741.97751: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 
'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f842796d6d0> <<< 27844 1726882741.97768: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842796de20> <<< 27844 1726882741.97819: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84279774c0> <<< 27844 1726882741.99546: stdout chunk (state=3): >>>import 'gc' # <<< 27844 1726882742.01573: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 27844 1726882742.01577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 27844 1726882742.01611: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842796d100> <<< 27844 1726882742.01615: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 27844 1726882742.01628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 27844 1726882742.01651: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842798b130> <<< 27844 1726882742.01705: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.01745: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 27844 1726882742.01765: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427b4d880> <<< 27844 1726882742.01769: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427b4d8b0> <<< 27844 1726882742.02013: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 27844 1726882742.02029: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 27844 1726882742.26638: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5Gg<<< 27844 1726882742.26666: stdout chunk (state=3): >>>D5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 900, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238911488, "block_size": 4096, "block_total": 65519355, "block_available": 64511453, "block_used": 1007902, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "02", "epoch": "1726882742", "epoch_int": "1726882742", "date": "2024-09-20", "time": "21:39:02", "iso8601_micro": "2024-09-21T01:39:02.218921Z", "iso8601": "2024-09-21T01:39:02Z", "iso8601_basic": "20240920T213902218921", "iso8601_basic_short": "20240920T213902", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": 
"root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.73, "5m": 0.51, "15m": 0.29}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 27844 1726882742.27155: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear 
sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale <<< 27844 1726882742.27225: stdout chunk (state=3): >>># cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools 
# cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing 
_posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 27844 1726882742.27254: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] 
removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info <<< 27844 1726882742.27281: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # 
cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout <<< 27844 1726882742.27328: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # 
cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl <<< 27844 1726882742.27346: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 27844 1726882742.27607: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 27844 1726882742.27621: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 27844 1726882742.27658: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 27844 1726882742.27702: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 27844 1726882742.27725: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 27844 1726882742.27739: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 27844 1726882742.27783: stdout 
chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 27844 1726882742.27831: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 27844 1726882742.27866: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 27844 1726882742.27896: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 <<< 27844 1726882742.27941: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 27844 1726882742.27987: stdout chunk (state=3): >>># destroy json <<< 27844 1726882742.27990: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 27844 1726882742.28052: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 27844 1726882742.28097: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # 
cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess <<< 27844 1726882742.28173: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 27844 1726882742.28248: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre <<< 27844 1726882742.28304: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 27844 1726882742.28307: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 27844 1726882742.28353: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 27844 1726882742.28539: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 27844 1726882742.28586: stdout chunk (state=3): >>># destroy tokenize <<< 27844 1726882742.28589: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 27844 1726882742.28614: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy 
operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 27844 1726882742.28650: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 27844 1726882742.29028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882742.29031: stdout chunk (state=3): >>><<< 27844 1726882742.29034: stderr chunk (state=3): >>><<< 27844 1726882742.29195: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428fb3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428fb3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8428fb3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f0f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f0f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f32850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f0f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f70880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f08d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f32d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428f58970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eaeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eb1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ea7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ead640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428e30d90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30880> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30e80> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30f40> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e30e50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e89d00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e825e0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8428e96640> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eb5df0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428e42c40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e89220> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428e96250> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ebb9a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42f70> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42d60> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42cd0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428b73340> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428b73430> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e4af70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e44a00> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e444c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428aa6190> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428b5dcd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e44e80> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428eb5fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ab8ac0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428ab8df0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428aca700> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428acac40> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a58370> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428ab8ee0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a68250> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428aca580> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a68310> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e429a0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a84670> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a84940> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a84730> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a84820> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a84c70> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428a921c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a848b0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a77a00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428e42580> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428a84a60> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f84289ad640> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283047c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428395790> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428395670> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283953d0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283954c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283951c0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428395430> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283957f0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842836e7c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f842836eb50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f842836e9a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282874f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842838ed30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428395550> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842838e160> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283bfa30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283621c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428362790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842828dd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283626a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283e3d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282e59a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283eedf0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282f50d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283eee20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283f5220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282f5100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283b9b50> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283eeac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283eecd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84283e3cd0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282f10d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282e7370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282f1d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f84282f16a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84282f2130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84283028b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842832f8e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e946d0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842836c7c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e98d60> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8428320070> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e60f10> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428329190> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8428325d60> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8427e98b80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427c13730> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e59bb0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427e59c40> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e4a1f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427eba940> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8220> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8310> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427ea4ca0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427e75d00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea4340> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427c7bf40> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8427ea4fd0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427bd0f40> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427bd0b80> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427bcfd00> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427ea8580> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8427b00520> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427b00a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_b8unb2vc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f842796d6d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842796de20> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84279774c0> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842796d100> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f842798b130> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427b4d880> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8427b4d8b0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], 
"uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 900, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238911488, "block_size": 4096, "block_total": 65519355, "block_available": 64511453, "block_used": 1007902, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "02", "epoch": "1726882742", "epoch_int": "1726882742", "date": "2024-09-20", "time": "21:39:02", "iso8601_micro": "2024-09-21T01:39:02.218921Z", "iso8601": "2024-09-21T01:39:02Z", "iso8601_basic": "20240920T213902218921", "iso8601_basic_short": "20240920T213902", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", 
"MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": 
"off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.73, "5m": 0.51, "15m": 0.29}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear 
sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] 
removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing 
shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing 
multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # 
cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing 
ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy 
multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] 
wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # 
destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
27844 1726882742.31108: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882742.31111: _low_level_execute_command(): starting 27844 1726882742.31113: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882741.0443697-27891-220359103952435/ > /dev/null 2>&1 && sleep 0' 27844 1726882742.31330: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882742.31339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.31350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.31365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.31406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.31413: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882742.31423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.31437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882742.31445: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 27844 1726882742.31451: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882742.31459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.31473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.31486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.31496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.31500: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882742.31509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.31577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882742.31590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882742.31597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882742.31735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882742.33569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882742.33624: stderr chunk (state=3): >>><<< 27844 1726882742.33627: stdout chunk (state=3): >>><<< 27844 1726882742.33671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882742.33674: handler run complete 27844 1726882742.33774: variable 'ansible_facts' from source: unknown 27844 1726882742.33872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.35104: variable 'ansible_facts' from source: unknown 27844 1726882742.35195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.35333: attempt loop complete, returning result 27844 1726882742.35341: _execute() done 27844 1726882742.35347: dumping result to json 27844 1726882742.35383: done dumping result, returning 27844 1726882742.35395: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-efa9-466a-0000000000bf] 27844 1726882742.35407: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000bf ok: [managed_node1] 27844 1726882742.36015: no more pending results, returning what we have 27844 1726882742.36018: results queue empty 27844 1726882742.36019: checking for any_errors_fatal 27844 1726882742.36020: done checking for any_errors_fatal 27844 1726882742.36021: checking for max_fail_percentage 27844 1726882742.36023: done checking for max_fail_percentage 27844 1726882742.36023: checking to see if all hosts have failed and the running result is not ok 27844 1726882742.36024: done 
checking to see if all hosts have failed 27844 1726882742.36025: getting the remaining hosts for this loop 27844 1726882742.36027: done getting the remaining hosts for this loop 27844 1726882742.36031: getting the next task for host managed_node1 27844 1726882742.36037: done getting next task for host managed_node1 27844 1726882742.36039: ^ task is: TASK: meta (flush_handlers) 27844 1726882742.36041: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882742.36045: getting variables 27844 1726882742.36047: in VariableManager get_vars() 27844 1726882742.36072: Calling all_inventory to load vars for managed_node1 27844 1726882742.36075: Calling groups_inventory to load vars for managed_node1 27844 1726882742.36078: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882742.36090: Calling all_plugins_play to load vars for managed_node1 27844 1726882742.36092: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882742.36095: Calling groups_plugins_play to load vars for managed_node1 27844 1726882742.36258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.36451: done with get_vars() 27844 1726882742.36462: done getting variables 27844 1726882742.36531: in VariableManager get_vars() 27844 1726882742.36540: Calling all_inventory to load vars for managed_node1 27844 1726882742.36543: Calling groups_inventory to load vars for managed_node1 27844 1726882742.36545: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882742.36550: Calling all_plugins_play to load vars for managed_node1 27844 1726882742.36552: Calling groups_plugins_inventory to load vars for managed_node1 27844 
1726882742.36555: Calling groups_plugins_play to load vars for managed_node1 27844 1726882742.37143: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000bf 27844 1726882742.37150: WORKER PROCESS EXITING 27844 1726882742.37172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.37358: done with get_vars() 27844 1726882742.37377: done queuing things up, now waiting for results queue to drain 27844 1726882742.37379: results queue empty 27844 1726882742.37380: checking for any_errors_fatal 27844 1726882742.37382: done checking for any_errors_fatal 27844 1726882742.37383: checking for max_fail_percentage 27844 1726882742.37384: done checking for max_fail_percentage 27844 1726882742.37385: checking to see if all hosts have failed and the running result is not ok 27844 1726882742.37385: done checking to see if all hosts have failed 27844 1726882742.37386: getting the remaining hosts for this loop 27844 1726882742.37387: done getting the remaining hosts for this loop 27844 1726882742.37390: getting the next task for host managed_node1 27844 1726882742.37394: done getting next task for host managed_node1 27844 1726882742.37396: ^ task is: TASK: Include the task 'el_repo_setup.yml' 27844 1726882742.37398: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882742.37400: getting variables 27844 1726882742.37401: in VariableManager get_vars() 27844 1726882742.37408: Calling all_inventory to load vars for managed_node1 27844 1726882742.37410: Calling groups_inventory to load vars for managed_node1 27844 1726882742.37412: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882742.37417: Calling all_plugins_play to load vars for managed_node1 27844 1726882742.37419: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882742.37422: Calling groups_plugins_play to load vars for managed_node1 27844 1726882742.37551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.37734: done with get_vars() 27844 1726882742.37741: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:11 Friday 20 September 2024 21:39:02 -0400 (0:00:01.430) 0:00:01.454 ****** 27844 1726882742.37820: entering _queue_task() for managed_node1/include_tasks 27844 1726882742.37822: Creating lock for include_tasks 27844 1726882742.38097: worker is 1 (out of 1 available) 27844 1726882742.38110: exiting _queue_task() for managed_node1/include_tasks 27844 1726882742.38122: done queuing things up, now waiting for results queue to drain 27844 1726882742.38124: waiting for pending results... 
27844 1726882742.38361: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 27844 1726882742.38447: in run() - task 0e448fcc-3ce9-efa9-466a-000000000006 27844 1726882742.38475: variable 'ansible_search_path' from source: unknown 27844 1726882742.38514: calling self._execute() 27844 1726882742.38595: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882742.38606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882742.38616: variable 'omit' from source: magic vars 27844 1726882742.38715: _execute() done 27844 1726882742.38721: dumping result to json 27844 1726882742.38728: done dumping result, returning 27844 1726882742.38736: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-efa9-466a-000000000006] 27844 1726882742.38746: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000006 27844 1726882742.38876: no more pending results, returning what we have 27844 1726882742.38881: in VariableManager get_vars() 27844 1726882742.38912: Calling all_inventory to load vars for managed_node1 27844 1726882742.38914: Calling groups_inventory to load vars for managed_node1 27844 1726882742.38918: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882742.38930: Calling all_plugins_play to load vars for managed_node1 27844 1726882742.38934: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882742.38936: Calling groups_plugins_play to load vars for managed_node1 27844 1726882742.39139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.39336: done with get_vars() 27844 1726882742.39342: variable 'ansible_search_path' from source: unknown 27844 1726882742.39356: we have included files to process 27844 1726882742.39357: generating all_blocks data 27844 1726882742.39358: done generating all_blocks data 27844 
1726882742.39359: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27844 1726882742.39360: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27844 1726882742.39363: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27844 1726882742.39786: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000006 27844 1726882742.39789: WORKER PROCESS EXITING 27844 1726882742.40199: in VariableManager get_vars() 27844 1726882742.40213: done with get_vars() 27844 1726882742.40224: done processing included file 27844 1726882742.40226: iterating over new_blocks loaded from include file 27844 1726882742.40227: in VariableManager get_vars() 27844 1726882742.40236: done with get_vars() 27844 1726882742.40238: filtering new block on tags 27844 1726882742.40252: done filtering new block on tags 27844 1726882742.40254: in VariableManager get_vars() 27844 1726882742.40266: done with get_vars() 27844 1726882742.40267: filtering new block on tags 27844 1726882742.40283: done filtering new block on tags 27844 1726882742.40285: in VariableManager get_vars() 27844 1726882742.40313: done with get_vars() 27844 1726882742.40315: filtering new block on tags 27844 1726882742.40329: done filtering new block on tags 27844 1726882742.40331: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 27844 1726882742.40336: extending task lists for all hosts with included blocks 27844 1726882742.40388: done extending task lists 27844 1726882742.40389: done processing included files 27844 1726882742.40390: results queue empty 27844 1726882742.40390: checking for any_errors_fatal 27844 1726882742.40392: done checking for any_errors_fatal 
27844 1726882742.40393: checking for max_fail_percentage 27844 1726882742.40394: done checking for max_fail_percentage 27844 1726882742.40394: checking to see if all hosts have failed and the running result is not ok 27844 1726882742.40395: done checking to see if all hosts have failed 27844 1726882742.40396: getting the remaining hosts for this loop 27844 1726882742.40397: done getting the remaining hosts for this loop 27844 1726882742.40399: getting the next task for host managed_node1 27844 1726882742.40403: done getting next task for host managed_node1 27844 1726882742.40405: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 27844 1726882742.40407: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882742.40409: getting variables 27844 1726882742.40410: in VariableManager get_vars() 27844 1726882742.40417: Calling all_inventory to load vars for managed_node1 27844 1726882742.40420: Calling groups_inventory to load vars for managed_node1 27844 1726882742.40422: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882742.40426: Calling all_plugins_play to load vars for managed_node1 27844 1726882742.40428: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882742.40431: Calling groups_plugins_play to load vars for managed_node1 27844 1726882742.40558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.40745: done with get_vars() 27844 1726882742.40752: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:39:02 -0400 (0:00:00.029) 0:00:01.484 ****** 27844 1726882742.40818: entering _queue_task() for managed_node1/setup 27844 1726882742.41026: worker is 1 (out of 1 available) 27844 1726882742.41037: exiting _queue_task() for managed_node1/setup 27844 1726882742.41049: done queuing things up, now waiting for results queue to drain 27844 1726882742.41051: waiting for pending results... 
27844 1726882742.41273: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 27844 1726882742.41377: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000d0 27844 1726882742.41398: variable 'ansible_search_path' from source: unknown 27844 1726882742.41406: variable 'ansible_search_path' from source: unknown 27844 1726882742.41441: calling self._execute() 27844 1726882742.41509: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882742.41519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882742.41531: variable 'omit' from source: magic vars 27844 1726882742.41995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882742.44156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882742.44236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882742.44280: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882742.44323: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882742.44354: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882742.44439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882742.44476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882742.44509: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882742.44559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882742.44583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882742.44758: variable 'ansible_facts' from source: unknown 27844 1726882742.44827: variable 'network_test_required_facts' from source: task vars 27844 1726882742.44867: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 27844 1726882742.44875: when evaluation is False, skipping this task 27844 1726882742.44880: _execute() done 27844 1726882742.44886: dumping result to json 27844 1726882742.44891: done dumping result, returning 27844 1726882742.44900: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-efa9-466a-0000000000d0] 27844 1726882742.44907: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d0 skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 27844 1726882742.45055: no more pending results, returning what we have 27844 1726882742.45058: results queue empty 27844 1726882742.45059: checking for any_errors_fatal 27844 1726882742.45060: done checking for any_errors_fatal 27844 1726882742.45061: checking for max_fail_percentage 27844 1726882742.45063: done checking for 
max_fail_percentage 27844 1726882742.45065: checking to see if all hosts have failed and the running result is not ok 27844 1726882742.45066: done checking to see if all hosts have failed 27844 1726882742.45067: getting the remaining hosts for this loop 27844 1726882742.45068: done getting the remaining hosts for this loop 27844 1726882742.45071: getting the next task for host managed_node1 27844 1726882742.45080: done getting next task for host managed_node1 27844 1726882742.45083: ^ task is: TASK: Check if system is ostree 27844 1726882742.45085: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882742.45089: getting variables 27844 1726882742.45091: in VariableManager get_vars() 27844 1726882742.45115: Calling all_inventory to load vars for managed_node1 27844 1726882742.45117: Calling groups_inventory to load vars for managed_node1 27844 1726882742.45120: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882742.45130: Calling all_plugins_play to load vars for managed_node1 27844 1726882742.45133: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882742.45136: Calling groups_plugins_play to load vars for managed_node1 27844 1726882742.45319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882742.45525: done with get_vars() 27844 1726882742.45533: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:39:02 -0400 (0:00:00.047) 0:00:01.532 ****** 27844 1726882742.45619: entering _queue_task() for managed_node1/stat 27844 1726882742.45636: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d0 27844 1726882742.45641: WORKER PROCESS EXITING 27844 1726882742.46033: worker is 1 (out of 1 available) 27844 1726882742.46043: exiting _queue_task() for managed_node1/stat 27844 1726882742.46054: done queuing things up, now waiting for results queue to drain 27844 1726882742.46056: waiting for pending results... 
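The fact-gathering task above was skipped because its `when:` expression, `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`, evaluated to False, meaning every required fact was already present. As an illustration (a re-implementation sketch, not Ansible's actual `intersect` filter code, with hypothetical fact names), the same check in plain Python:

```python
def intersect(a, b):
    # Approximates ansible.builtin.intersect: unique items present in both
    # lists. Preserving the order of the first list is an assumption made
    # here for illustration.
    seen = set()
    out = []
    for item in a:
        if item in b and item not in seen:
            seen.add(item)
            out.append(item)
    return out

# Hypothetical values: the required facts are already in ansible_facts,
# so the conditional comes out False and the task is skipped.
network_test_required_facts = ["distribution", "distribution_version"]
ansible_facts = {"distribution": "RedHat", "distribution_version": "9.4"}

gathered = list(ansible_facts.keys())
condition = not (intersect(gathered, network_test_required_facts)
                 == network_test_required_facts)
print(condition)  # False -> task skipped, matching the log's skip_reason
```

When the intersection of gathered fact names with the required list equals the required list itself, the negation yields False and the setup task is not queued again.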
27844 1726882742.46270: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 27844 1726882742.46373: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000d2 27844 1726882742.46396: variable 'ansible_search_path' from source: unknown 27844 1726882742.46404: variable 'ansible_search_path' from source: unknown 27844 1726882742.46442: calling self._execute() 27844 1726882742.46513: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882742.46524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882742.46538: variable 'omit' from source: magic vars 27844 1726882742.46987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882742.47222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882742.47273: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882742.47310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882742.47366: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882742.47447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882742.47481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882742.47513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882742.47545: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882742.47661: Evaluated conditional (not __network_is_ostree is defined): True 27844 1726882742.47676: variable 'omit' from source: magic vars 27844 1726882742.47713: variable 'omit' from source: magic vars 27844 1726882742.47752: variable 'omit' from source: magic vars 27844 1726882742.47781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882742.47814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882742.47836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882742.47855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882742.47871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882742.47902: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882742.47914: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882742.47921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882742.48014: Set connection var ansible_shell_type to sh 27844 1726882742.48023: Set connection var ansible_connection to ssh 27844 1726882742.48033: Set connection var ansible_pipelining to False 27844 1726882742.48042: Set connection var ansible_timeout to 10 27844 1726882742.48050: Set connection var ansible_shell_executable to /bin/sh 27844 1726882742.48058: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882742.48087: variable 'ansible_shell_executable' from source: unknown 27844 1726882742.48095: variable 'ansible_connection' from 
source: unknown 27844 1726882742.48102: variable 'ansible_module_compression' from source: unknown 27844 1726882742.48108: variable 'ansible_shell_type' from source: unknown 27844 1726882742.48115: variable 'ansible_shell_executable' from source: unknown 27844 1726882742.48121: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882742.48133: variable 'ansible_pipelining' from source: unknown 27844 1726882742.48140: variable 'ansible_timeout' from source: unknown 27844 1726882742.48147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882742.48291: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882742.48306: variable 'omit' from source: magic vars 27844 1726882742.48315: starting attempt loop 27844 1726882742.48321: running the handler 27844 1726882742.48337: _low_level_execute_command(): starting 27844 1726882742.48353: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882742.49096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882742.49115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.49131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.49150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.49196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.49211: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882742.49228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 
1726882742.49247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882742.49260: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882742.49275: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882742.49289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.49303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.49319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.49334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.49346: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882742.49361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.49440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882742.49458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882742.49477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882742.49616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882742.51286: stdout chunk (state=3): >>>/root <<< 27844 1726882742.51386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882742.51457: stderr chunk (state=3): >>><<< 27844 1726882742.51461: stdout chunk (state=3): >>><<< 27844 1726882742.51561: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882742.51577: _low_level_execute_command(): starting 27844 1726882742.51581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435 `" && echo ansible-tmp-1726882742.5148308-27958-265438637174435="` echo /root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435 `" ) && sleep 0' 27844 1726882742.52148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882742.52161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.52183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.52201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.52241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.52252: stderr chunk (state=3): >>>debug2: match not found <<< 27844 
1726882742.52270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.52289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882742.52301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882742.52312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882742.52324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.52338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.52355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.52373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.52385: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882742.52399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.52470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882742.52488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882742.52502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882742.52672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882742.54520: stdout chunk (state=3): >>>ansible-tmp-1726882742.5148308-27958-265438637174435=/root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435 <<< 27844 1726882742.54627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882742.54696: stderr chunk (state=3): >>><<< 27844 1726882742.54699: stdout chunk (state=3): >>><<< 27844 1726882742.54771: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882742.5148308-27958-265438637174435=/root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882742.54774: variable 'ansible_module_compression' from source: unknown 27844 1726882742.54874: ANSIBALLZ: Using lock for stat 27844 1726882742.54877: ANSIBALLZ: Acquiring lock 27844 1726882742.54879: ANSIBALLZ: Lock acquired: 139916606271904 27844 1726882742.54881: ANSIBALLZ: Creating module 27844 1726882742.67037: ANSIBALLZ: Writing module into payload 27844 1726882742.67161: ANSIBALLZ: Writing module 27844 1726882742.67191: ANSIBALLZ: Renaming module 27844 1726882742.67202: ANSIBALLZ: Done creating module 27844 1726882742.67222: variable 'ansible_facts' from source: unknown 27844 1726882742.67302: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435/AnsiballZ_stat.py 27844 1726882742.67456: Sending initial data 27844 1726882742.67460: Sent initial data (153 bytes) 27844 1726882742.68487: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882742.68502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.68518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.68538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.68583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.68594: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882742.68605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.68620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882742.68632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882742.68643: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882742.68655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.68671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.68691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.68702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.68711: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882742.68724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.68803: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882742.68819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882742.68833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882742.68961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882742.70809: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882742.70901: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882742.71006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmptvzfatzr /root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435/AnsiballZ_stat.py <<< 27844 1726882742.71097: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882742.72396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882742.72639: stderr chunk (state=3): >>><<< 27844 1726882742.72642: stdout chunk (state=3): >>><<< 27844 1726882742.72644: done transferring module to remote 27844 1726882742.72646: _low_level_execute_command(): starting 27844 1726882742.72648: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435/ /root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435/AnsiballZ_stat.py && sleep 0' 27844 1726882742.74330: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882742.74345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.74360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.74382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.74423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.74436: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882742.74451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.74474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882742.74487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882742.74499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882742.74511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.74525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.74541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.74553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.74567: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882742.74582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.74655: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882742.74682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882742.74698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882742.74819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882742.76592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882742.76672: stderr chunk (state=3): >>><<< 27844 1726882742.76683: stdout chunk (state=3): >>><<< 27844 1726882742.76784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882742.76788: _low_level_execute_command(): starting 27844 1726882742.76791: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435/AnsiballZ_stat.py && sleep 0' 27844 1726882742.78129: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882742.78180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.78205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.78229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.78272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.78328: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882742.78343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.78365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882742.78445: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882742.78458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882742.78473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.78488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.78504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.78548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.78561: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882742.78580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.78778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 
1726882742.78801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882742.78818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882742.78948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882742.80906: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 27844 1726882742.80910: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 27844 1726882742.80972: stdout chunk (state=3): >>>import '_io' # <<< 27844 1726882742.80975: stdout chunk (state=3): >>>import 'marshal' # <<< 27844 1726882742.81006: stdout chunk (state=3): >>>import 'posix' # <<< 27844 1726882742.81039: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 27844 1726882742.81043: stdout chunk (state=3): >>># installing zipimport hook <<< 27844 1726882742.81086: stdout chunk (state=3): >>>import 'time' # <<< 27844 1726882742.81089: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 27844 1726882742.81139: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.81174: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 27844 1726882742.81189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 27844 1726882742.81204: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbf3dc0> <<< 27844 1726882742.81245: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 27844 1726882742.81271: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbf3b20> <<< 27844 1726882742.81307: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 27844 1726882742.81331: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbf3ac0> <<< 27844 1726882742.81358: stdout chunk (state=3): >>>import '_signal' # <<< 27844 1726882742.81375: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98490> <<< 27844 1726882742.81418: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 27844 1726882742.81437: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 27844 1726882742.81460: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98940> <<< 27844 1726882742.81465: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98670> <<< 27844 1726882742.81497: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches 
/usr/lib64/python3.9/site.py <<< 27844 1726882742.81501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 27844 1726882742.81525: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 27844 1726882742.81553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 27844 1726882742.81579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 27844 1726882742.81598: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db4f190> <<< 27844 1726882742.81620: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 27844 1726882742.81638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 27844 1726882742.81700: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db4f220> <<< 27844 1726882742.81731: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 27844 1726882742.81776: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db4f940> <<< 27844 1726882742.81807: stdout chunk (state=3): >>>import 
'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbb0880> <<< 27844 1726882742.81822: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db48d90> <<< 27844 1726882742.81893: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 27844 1726882742.81897: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db72d90> <<< 27844 1726882742.81938: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98970> <<< 27844 1726882742.81973: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 27844 1726882742.82168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 27844 1726882742.82256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 27844 1726882742.82277: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db13eb0> <<< 27844 1726882742.82335: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db16f40> <<< 27844 1726882742.82362: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 27844 1726882742.82380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 27844 1726882742.82422: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 27844 1726882742.82426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 27844 1726882742.82453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fea8db0c610> <<< 27844 1726882742.82484: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db12640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db13370> <<< 27844 1726882742.82497: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 27844 1726882742.82575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 27844 1726882742.82588: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 27844 1726882742.82623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.82641: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 27844 1726882742.82683: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8da94df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da948e0> import 'itertools' # <<< 27844 1726882742.82725: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da94ee0> <<< 27844 1726882742.82728: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 27844 1726882742.82776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da94fa0> <<< 27844 1726882742.82825: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da94eb0> <<< 27844 1726882742.82829: stdout chunk (state=3): >>>import '_collections' # <<< 27844 1726882742.82894: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daeed60> <<< 27844 1726882742.82898: stdout chunk (state=3): >>>import '_functools' # <<< 27844 1726882742.82911: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dae7640> <<< 27844 1726882742.82977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 27844 1726882742.82981: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dafa6a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db1ae20> <<< 27844 1726882742.83024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 27844 1726882742.83027: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module 
'_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8daa7ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daee280> <<< 27844 1726882742.83074: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8dafa2b0> <<< 27844 1726882742.83083: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db209d0> <<< 27844 1726882742.83132: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 27844 1726882742.83156: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.83159: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 27844 1726882742.83189: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7dc0> <<< 27844 1726882742.83218: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7d30> <<< 27844 1726882742.83246: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 27844 1726882742.83258: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 27844 1726882742.83279: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 27844 1726882742.83330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 27844 1726882742.83364: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da7a3a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 27844 1726882742.83384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 27844 1726882742.83414: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da7a490> <<< 27844 1726882742.83544: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daaefd0> <<< 27844 1726882742.83581: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa9a60> <<< 27844 
1726882742.83607: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa9580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 27844 1726882742.83622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 27844 1726882742.83650: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 27844 1726882742.83685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 27844 1726882742.83698: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7881f0> <<< 27844 1726882742.83723: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da65b80> <<< 27844 1726882742.83780: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa9ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db20040> <<< 27844 1726882742.83802: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 27844 1726882742.83827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 27844 1726882742.83852: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d79ab20> <<< 27844 
1726882742.83866: stdout chunk (state=3): >>>import 'errno' # <<< 27844 1726882742.83907: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d79ae50> <<< 27844 1726882742.83931: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 27844 1726882742.83950: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7ac760> <<< 27844 1726882742.83970: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 27844 1726882742.84005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 27844 1726882742.84028: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7acca0> <<< 27844 1726882742.84061: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d7443d0> <<< 27844 1726882742.84096: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d79af40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches 
/usr/lib64/python3.9/lzma.py <<< 27844 1726882742.84113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 27844 1726882742.84153: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d7552b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7ac5e0> import 'pwd' # <<< 27844 1726882742.84190: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d755370> <<< 27844 1726882742.84237: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7a00> <<< 27844 1726882742.84260: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 27844 1726882742.84288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 27844 1726882742.84301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 27844 1726882742.84344: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fea8d7706d0> <<< 27844 1726882742.84378: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882742.84402: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d7709a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d770790> <<< 27844 1726882742.84414: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d770880> <<< 27844 1726882742.84441: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 27844 1726882742.84635: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d770cd0> <<< 27844 1726882742.84682: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' 
import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d77d220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d770910> <<< 27844 1726882742.84702: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d764a60> <<< 27844 1726882742.84724: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa75e0> <<< 27844 1726882742.84735: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 27844 1726882742.84797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 27844 1726882742.84823: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d770ac0> <<< 27844 1726882742.84927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 27844 1726882742.84938: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fea8d68b6a0> <<< 27844 1726882742.85095: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip' # zipimport: zlib available <<< 27844 1726882742.85182: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.85217: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/__init__.py <<< 27844 1726882742.85239: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.85250: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 27844 1726882742.86470: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 27844 1726882742.87422: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5897f0> <<< 27844 1726882742.87446: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.87484: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 27844 1726882742.87503: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 27844 1726882742.87517: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d618760> <<< 27844 1726882742.87550: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618640> <<< 27844 1726882742.87590: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618370> <<< 27844 1726882742.87606: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 27844 1726882742.87655: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618190> import 'atexit' # <<< 27844 1726882742.87692: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d6183d0> <<< 27844 1726882742.87705: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 27844 1726882742.87732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 27844 1726882742.87767: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d6187c0> <<< 27844 1726882742.87795: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 27844 1726882742.87830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 27844 1726882742.87834: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 27844 1726882742.87866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 27844 1726882742.87870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 27844 1726882742.87943: stdout chunk (state=3): >>>import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfed7f0> <<< 27844 1726882742.87992: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882742.88013: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfedb80> <<< 27844 1726882742.88037: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfed9d0> <<< 27844 1726882742.88040: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 27844 1726882742.88057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 27844 1726882742.88086: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d528af0> <<< 27844 1726882742.88101: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d612d60> <<< 27844 1726882742.88278: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618520> <<< 27844 1726882742.88309: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 27844 1726882742.88345: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fea8d612190> <<< 27844 1726882742.88359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 27844 1726882742.88403: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 27844 1726882742.88421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 27844 1726882742.88424: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d585ac0> <<< 27844 1726882742.88530: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5bbe80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5bb8b0> <<< 27844 1726882742.88534: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5222e0> <<< 27844 1726882742.88580: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d5bb9a0> <<< 27844 1726882742.88590: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5e9cd0> <<< 27844 1726882742.88609: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 27844 1726882742.88623: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 27844 1726882742.88646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 27844 1726882742.88730: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfcea00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5f1e50> <<< 27844 1726882742.88750: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 27844 1726882742.88825: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882742.88828: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfdd0a0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5f1f10> <<< 27844 1726882742.88842: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 27844 1726882742.88892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.88906: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 27844 1726882742.88962: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5be6d0> <<< 27844 1726882742.89096: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfdd0d0> <<< 27844 1726882742.89187: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfda550> <<< 27844 1726882742.89220: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfda610> <<< 27844 1726882742.89274: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed 
from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfd9c40> <<< 27844 1726882742.89278: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5e9c10> <<< 27844 1726882742.89306: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 27844 1726882742.89319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 27844 1726882742.89366: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d579b20> <<< 27844 1726882742.89587: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 27844 1726882742.89602: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d578940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfd0820> <<< 27844 1726882742.89624: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d579580> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5b2ac0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 27844 1726882742.89638: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.89714: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.89804: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27844 1726882742.89836: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 27844 1726882742.89852: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 27844 1726882742.89946: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.90046: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.90493: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.90973: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 27844 1726882742.90977: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 27844 1726882742.90996: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.91051: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cb9fd90> <<< 27844 1726882742.91117: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 27844 1726882742.91140: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfaa580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cf9bdf0> <<< 27844 1726882742.91202: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 27844 1726882742.91212: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.91233: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 27844 1726882742.91353: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.91488: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches 
/usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 27844 1726882742.91507: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d56f9a0> # zipimport: zlib available <<< 27844 1726882742.91897: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92255: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92312: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92382: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 27844 1726882742.92385: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92415: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92453: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 27844 1726882742.92456: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92503: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92589: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 27844 1726882742.92621: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 27844 1726882742.92624: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92648: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.92689: stdout chunk (state=3): >>>import 
ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 27844 1726882742.92870: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93060: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 27844 1726882742.93094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 27844 1726882742.93097: stdout chunk (state=3): >>>import '_ast' # <<< 27844 1726882742.93167: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cb9f640> # zipimport: zlib available <<< 27844 1726882742.93231: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93302: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 27844 1726882742.93321: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 27844 1726882742.93365: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93404: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 27844 
1726882742.93407: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93439: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93479: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93569: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93626: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 27844 1726882742.93651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 27844 1726882742.93718: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d603be0> <<< 27844 1726882742.93751: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cb72f10> <<< 27844 1726882742.93792: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 27844 1726882742.93795: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93913: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93965: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.93995: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.94044: stdout chunk 
(state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 27844 1726882742.94056: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 27844 1726882742.94095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 27844 1726882742.94130: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 27844 1726882742.94133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 27844 1726882742.94221: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfb7bb0> <<< 27844 1726882742.94257: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cf9c340> <<< 27844 1726882742.94320: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cf987f0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 27844 1726882742.94357: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.94374: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 27844 1726882742.94454: stdout chunk (state=3): >>>import ansible.module_utils.basic # 
loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 27844 1726882742.94485: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 27844 1726882742.94488: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.94594: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.94770: stdout chunk (state=3): >>># zipimport: zlib available <<< 27844 1726882742.94909: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 27844 1726882742.95172: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 27844 1726882742.95176: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 27844 1726882742.95205: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external <<< 27844 1726882742.95222: stdout chunk (state=3): >>># cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing 
encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword <<< 27844 1726882742.95255: stdout chunk (state=3): >>># destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing 
fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 27844 1726882742.95304: stdout chunk (state=3): >>># cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing 
systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 27844 1726882742.95311: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 27844 1726882742.95502: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 27844 1726882742.95551: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 27844 1726882742.95576: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 27844 1726882742.95612: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime <<< 27844 1726882742.95615: stdout chunk (state=3): 
>>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 27844 1726882742.95734: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 27844 1726882742.95752: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 27844 1726882742.95823: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc <<< 27844 1726882742.95866: stdout chunk (state=3): >>># cleanup[3] wiping 
_collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 27844 1726882742.95882: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 27844 1726882742.96052: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq <<< 27844 1726882742.96097: stdout chunk (state=3): >>># destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator <<< 27844 1726882742.96101: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 27844 1726882742.96132: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 27844 1726882742.96555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882742.96558: stdout chunk (state=3): >>><<< 27844 1726882742.96560: stderr chunk (state=3): >>><<< 27844 1726882742.96728: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbf3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbf3b20> # 
/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbf3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fea8db4f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db4f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dbb0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db48d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db72d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db98970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db13eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db16f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db0c610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db12640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db13370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8da94df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da948e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da94ee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da94fa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da94eb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daeed60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8dae7640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea8dafa6a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db1ae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8daa7ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daee280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8dafa2b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db209d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7dc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da7a3a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da7a490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daaefd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa9a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa9580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7881f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8da65b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa9ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8db20040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d79ab20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d79ae50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7ac760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7acca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d7443d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d79af40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d7552b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d7ac5e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d755370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa7a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d7706d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d7709a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d770790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d770880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d770cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d77d220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d770910> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d764a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8daa75e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d770ac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fea8d68b6a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5897f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d618760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618190> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d6183d0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d6187c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfed7f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfedb80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfed9d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d528af0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d612d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d618520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d612190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d585ac0> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5bbe80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5bb8b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5222e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d5bb9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5e9cd0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfcea00> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5f1e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfdd0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5f1f10> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5be6d0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfdd0d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfda550> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfda610> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cfd9c40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5e9c10> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d579b20> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d578940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfd0820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d579580> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fea8d5b2ac0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8cb9fd90> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfaa580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cf9bdf0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8d56f9a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cb9f640> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fea8d603be0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cb72f10> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cfb7bb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cf9c340> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fea8cf987f0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_rs45y4lo/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: (interpreter shutdown output, identical to the module stdout shown above) 27844 1726882742.97375: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882742.97378: _low_level_execute_command(): starting 27844 1726882742.97380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882742.5148308-27958-265438637174435/ > /dev/null 2>&1 && sleep 0' 27844 1726882742.98812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882742.98894: stderr chunk
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.98910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.98927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.98973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.98988: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882742.99004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.99021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882742.99032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882742.99062: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882742.99081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882742.99094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882742.99113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882742.99173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882742.99185: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882742.99199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882742.99339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882742.99388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882742.99437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882742.99627: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882743.01524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882743.01554: stderr chunk (state=3): >>><<< 27844 1726882743.01557: stdout chunk (state=3): >>><<< 27844 1726882743.01690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882743.01693: handler run complete 27844 1726882743.01696: attempt loop complete, returning result 27844 1726882743.01698: _execute() done 27844 1726882743.01700: dumping result to json 27844 1726882743.02001: done dumping result, returning 27844 1726882743.02005: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-efa9-466a-0000000000d2] 27844 1726882743.02007: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d2 27844 1726882743.02079: done 
sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d2 27844 1726882743.02083: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 27844 1726882743.02144: no more pending results, returning what we have 27844 1726882743.02147: results queue empty 27844 1726882743.02148: checking for any_errors_fatal 27844 1726882743.02153: done checking for any_errors_fatal 27844 1726882743.02154: checking for max_fail_percentage 27844 1726882743.02156: done checking for max_fail_percentage 27844 1726882743.02157: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.02158: done checking to see if all hosts have failed 27844 1726882743.02159: getting the remaining hosts for this loop 27844 1726882743.02160: done getting the remaining hosts for this loop 27844 1726882743.02165: getting the next task for host managed_node1 27844 1726882743.02171: done getting next task for host managed_node1 27844 1726882743.02174: ^ task is: TASK: Set flag to indicate system is ostree 27844 1726882743.02177: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.02180: getting variables 27844 1726882743.02182: in VariableManager get_vars() 27844 1726882743.02210: Calling all_inventory to load vars for managed_node1 27844 1726882743.02213: Calling groups_inventory to load vars for managed_node1 27844 1726882743.02216: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.02226: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.02229: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.02231: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.02404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.02618: done with get_vars() 27844 1726882743.02628: done getting variables 27844 1726882743.02735: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:39:03 -0400 (0:00:00.571) 0:00:02.104 ****** 27844 1726882743.02873: entering _queue_task() for managed_node1/set_fact 27844 1726882743.02875: Creating lock for set_fact 27844 1726882743.03184: worker is 1 (out of 1 available) 27844 1726882743.03195: exiting _queue_task() for managed_node1/set_fact 27844 1726882743.03206: done queuing things up, now waiting for results queue to drain 27844 1726882743.03208: waiting for pending results... 
27844 1726882743.03460: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 27844 1726882743.03572: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000d3 27844 1726882743.03592: variable 'ansible_search_path' from source: unknown 27844 1726882743.03601: variable 'ansible_search_path' from source: unknown 27844 1726882743.03642: calling self._execute() 27844 1726882743.03717: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.03727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.03738: variable 'omit' from source: magic vars 27844 1726882743.04238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882743.04471: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882743.04514: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882743.04554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882743.04597: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882743.04691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882743.04721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882743.04757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882743.04793: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882743.04915: Evaluated conditional (not __network_is_ostree is defined): True 27844 1726882743.04925: variable 'omit' from source: magic vars 27844 1726882743.04971: variable 'omit' from source: magic vars 27844 1726882743.05707: variable '__ostree_booted_stat' from source: set_fact 27844 1726882743.05758: variable 'omit' from source: magic vars 27844 1726882743.05797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882743.05828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882743.05852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882743.05879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882743.05896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882743.05928: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882743.05936: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.05943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.06046: Set connection var ansible_shell_type to sh 27844 1726882743.06054: Set connection var ansible_connection to ssh 27844 1726882743.06069: Set connection var ansible_pipelining to False 27844 1726882743.06081: Set connection var ansible_timeout to 10 27844 1726882743.06090: Set connection var ansible_shell_executable to /bin/sh 27844 1726882743.06098: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882743.06131: variable 'ansible_shell_executable' 
from source: unknown 27844 1726882743.06138: variable 'ansible_connection' from source: unknown 27844 1726882743.06145: variable 'ansible_module_compression' from source: unknown 27844 1726882743.06151: variable 'ansible_shell_type' from source: unknown 27844 1726882743.06156: variable 'ansible_shell_executable' from source: unknown 27844 1726882743.06162: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.06175: variable 'ansible_pipelining' from source: unknown 27844 1726882743.06182: variable 'ansible_timeout' from source: unknown 27844 1726882743.06190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.06323: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882743.06456: variable 'omit' from source: magic vars 27844 1726882743.06469: starting attempt loop 27844 1726882743.06559: running the handler 27844 1726882743.06577: handler run complete 27844 1726882743.06590: attempt loop complete, returning result 27844 1726882743.06596: _execute() done 27844 1726882743.06601: dumping result to json 27844 1726882743.06606: done dumping result, returning 27844 1726882743.06616: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-efa9-466a-0000000000d3] 27844 1726882743.06625: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d3 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 27844 1726882743.06757: no more pending results, returning what we have 27844 1726882743.06760: results queue empty 27844 1726882743.06761: checking for any_errors_fatal 27844 1726882743.06773: done checking for any_errors_fatal 27844 
1726882743.06774: checking for max_fail_percentage 27844 1726882743.06776: done checking for max_fail_percentage 27844 1726882743.06776: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.06777: done checking to see if all hosts have failed 27844 1726882743.06778: getting the remaining hosts for this loop 27844 1726882743.06780: done getting the remaining hosts for this loop 27844 1726882743.06783: getting the next task for host managed_node1 27844 1726882743.06792: done getting next task for host managed_node1 27844 1726882743.06795: ^ task is: TASK: Fix CentOS6 Base repo 27844 1726882743.06798: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.06802: getting variables 27844 1726882743.06804: in VariableManager get_vars() 27844 1726882743.06833: Calling all_inventory to load vars for managed_node1 27844 1726882743.06837: Calling groups_inventory to load vars for managed_node1 27844 1726882743.06840: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.06851: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.06853: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.06856: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.07061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.07262: done with get_vars() 27844 1726882743.07275: done getting variables 27844 1726882743.07885: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d3 27844 1726882743.07889: WORKER PROCESS EXITING 27844 1726882743.07892: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:39:03 -0400 (0:00:00.051) 0:00:02.155 ****** 27844 1726882743.07920: entering _queue_task() for managed_node1/copy 27844 1726882743.08170: worker is 1 (out of 1 available) 27844 1726882743.08182: exiting _queue_task() for managed_node1/copy 27844 1726882743.08195: done queuing things up, now waiting for results queue to drain 27844 1726882743.08197: waiting for pending results... 
27844 1726882743.08440: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 27844 1726882743.08551: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000d5 27844 1726882743.08574: variable 'ansible_search_path' from source: unknown 27844 1726882743.08581: variable 'ansible_search_path' from source: unknown 27844 1726882743.08621: calling self._execute() 27844 1726882743.08708: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.08719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.08731: variable 'omit' from source: magic vars 27844 1726882743.09204: variable 'ansible_distribution' from source: facts 27844 1726882743.09230: Evaluated conditional (ansible_distribution == 'CentOS'): True 27844 1726882743.09352: variable 'ansible_distribution_major_version' from source: facts 27844 1726882743.09368: Evaluated conditional (ansible_distribution_major_version == '6'): False 27844 1726882743.09376: when evaluation is False, skipping this task 27844 1726882743.09383: _execute() done 27844 1726882743.09389: dumping result to json 27844 1726882743.09400: done dumping result, returning 27844 1726882743.09410: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-efa9-466a-0000000000d5] 27844 1726882743.09420: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 27844 1726882743.09580: no more pending results, returning what we have 27844 1726882743.09584: results queue empty 27844 1726882743.09585: checking for any_errors_fatal 27844 1726882743.09591: done checking for any_errors_fatal 27844 1726882743.09592: checking for max_fail_percentage 27844 1726882743.09594: done checking for max_fail_percentage 27844 1726882743.09594: checking to see if all hosts have failed and the 
running result is not ok 27844 1726882743.09595: done checking to see if all hosts have failed 27844 1726882743.09596: getting the remaining hosts for this loop 27844 1726882743.09598: done getting the remaining hosts for this loop 27844 1726882743.09602: getting the next task for host managed_node1 27844 1726882743.09608: done getting next task for host managed_node1 27844 1726882743.09611: ^ task is: TASK: Include the task 'enable_epel.yml' 27844 1726882743.09614: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.09618: getting variables 27844 1726882743.09620: in VariableManager get_vars() 27844 1726882743.09650: Calling all_inventory to load vars for managed_node1 27844 1726882743.09653: Calling groups_inventory to load vars for managed_node1 27844 1726882743.09657: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.09675: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.09679: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.09683: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.09867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.10103: done with get_vars() 27844 1726882743.10113: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:39:03 -0400 (0:00:00.023) 0:00:02.178 ****** 27844 1726882743.10233: entering _queue_task() for managed_node1/include_tasks 27844 1726882743.10250: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d5 27844 1726882743.10256: WORKER PROCESS EXITING 27844 1726882743.10675: worker is 1 (out of 1 available) 27844 1726882743.10685: exiting _queue_task() for managed_node1/include_tasks 27844 1726882743.10696: done queuing things up, now waiting for results queue to drain 27844 1726882743.10697: waiting for pending results... 
27844 1726882743.10919: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 27844 1726882743.11013: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000d6 27844 1726882743.11032: variable 'ansible_search_path' from source: unknown 27844 1726882743.11042: variable 'ansible_search_path' from source: unknown 27844 1726882743.11086: calling self._execute() 27844 1726882743.11168: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.11180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.11193: variable 'omit' from source: magic vars 27844 1726882743.11781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882743.16954: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882743.17346: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882743.17454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882743.17547: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882743.18737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882743.18831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882743.18878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882743.18910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882743.18970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882743.18990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882743.19123: variable '__network_is_ostree' from source: set_fact 27844 1726882743.19146: Evaluated conditional (not __network_is_ostree | d(false)): True 27844 1726882743.19161: _execute() done 27844 1726882743.19183: dumping result to json 27844 1726882743.19191: done dumping result, returning 27844 1726882743.19202: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-efa9-466a-0000000000d6] 27844 1726882743.19211: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d6 27844 1726882743.19598: no more pending results, returning what we have 27844 1726882743.19605: in VariableManager get_vars() 27844 1726882743.19640: Calling all_inventory to load vars for managed_node1 27844 1726882743.19643: Calling groups_inventory to load vars for managed_node1 27844 1726882743.19647: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.19659: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.19662: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.19672: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.19916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.20529: done with get_vars() 27844 1726882743.20536: variable 'ansible_search_path' from source: unknown 27844 
1726882743.20537: variable 'ansible_search_path' from source: unknown 27844 1726882743.20579: we have included files to process 27844 1726882743.20581: generating all_blocks data 27844 1726882743.20582: done generating all_blocks data 27844 1726882743.20595: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27844 1726882743.20596: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27844 1726882743.20599: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27844 1726882743.21216: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000d6 27844 1726882743.21220: WORKER PROCESS EXITING 27844 1726882743.21772: done processing included file 27844 1726882743.21775: iterating over new_blocks loaded from include file 27844 1726882743.21777: in VariableManager get_vars() 27844 1726882743.21788: done with get_vars() 27844 1726882743.21790: filtering new block on tags 27844 1726882743.21819: done filtering new block on tags 27844 1726882743.21822: in VariableManager get_vars() 27844 1726882743.21832: done with get_vars() 27844 1726882743.21834: filtering new block on tags 27844 1726882743.21845: done filtering new block on tags 27844 1726882743.21847: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 27844 1726882743.21852: extending task lists for all hosts with included blocks 27844 1726882743.22081: done extending task lists 27844 1726882743.22082: done processing included files 27844 1726882743.22083: results queue empty 27844 1726882743.22084: checking for any_errors_fatal 27844 1726882743.22087: done checking for any_errors_fatal 27844 1726882743.22087: checking for max_fail_percentage 27844 
1726882743.22089: done checking for max_fail_percentage 27844 1726882743.22089: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.22090: done checking to see if all hosts have failed 27844 1726882743.22091: getting the remaining hosts for this loop 27844 1726882743.22092: done getting the remaining hosts for this loop 27844 1726882743.22095: getting the next task for host managed_node1 27844 1726882743.22099: done getting next task for host managed_node1 27844 1726882743.22101: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 27844 1726882743.22104: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.22106: getting variables 27844 1726882743.22107: in VariableManager get_vars() 27844 1726882743.22319: Calling all_inventory to load vars for managed_node1 27844 1726882743.22322: Calling groups_inventory to load vars for managed_node1 27844 1726882743.22324: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.22330: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.22336: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.22339: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.22596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.22811: done with get_vars() 27844 1726882743.22819: done getting variables 27844 1726882743.22892: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 27844 1726882743.23136: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:39:03 -0400 (0:00:00.129) 0:00:02.308 ****** 27844 1726882743.23185: entering _queue_task() for managed_node1/command 27844 1726882743.23187: Creating lock for command 27844 1726882743.23473: worker is 1 (out of 1 available) 27844 1726882743.23486: exiting _queue_task() for managed_node1/command 27844 1726882743.23497: done queuing things up, now waiting for results queue to drain 27844 1726882743.23499: waiting for pending results... 
27844 1726882743.23761: running TaskExecutor() for managed_node1/TASK: Create EPEL 9 27844 1726882743.23885: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000f0 27844 1726882743.23902: variable 'ansible_search_path' from source: unknown 27844 1726882743.23909: variable 'ansible_search_path' from source: unknown 27844 1726882743.23957: calling self._execute() 27844 1726882743.24033: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.24046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.24070: variable 'omit' from source: magic vars 27844 1726882743.24430: variable 'ansible_distribution' from source: facts 27844 1726882743.24446: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27844 1726882743.24586: variable 'ansible_distribution_major_version' from source: facts 27844 1726882743.24604: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27844 1726882743.24615: when evaluation is False, skipping this task 27844 1726882743.24621: _execute() done 27844 1726882743.24627: dumping result to json 27844 1726882743.24635: done dumping result, returning 27844 1726882743.24645: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-efa9-466a-0000000000f0] 27844 1726882743.24654: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f0 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27844 1726882743.24814: no more pending results, returning what we have 27844 1726882743.24819: results queue empty 27844 1726882743.24820: checking for any_errors_fatal 27844 1726882743.24822: done checking for any_errors_fatal 27844 1726882743.24822: checking for max_fail_percentage 27844 1726882743.24824: done checking for max_fail_percentage 27844 1726882743.24825: checking to see if all hosts have failed and 
the running result is not ok 27844 1726882743.24826: done checking to see if all hosts have failed 27844 1726882743.24827: getting the remaining hosts for this loop 27844 1726882743.24828: done getting the remaining hosts for this loop 27844 1726882743.24831: getting the next task for host managed_node1 27844 1726882743.24838: done getting next task for host managed_node1 27844 1726882743.24840: ^ task is: TASK: Install yum-utils package 27844 1726882743.24844: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.24847: getting variables 27844 1726882743.24849: in VariableManager get_vars() 27844 1726882743.24881: Calling all_inventory to load vars for managed_node1 27844 1726882743.24884: Calling groups_inventory to load vars for managed_node1 27844 1726882743.24889: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.24902: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.24905: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.24908: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.25128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.25357: done with get_vars() 27844 1726882743.25369: done getting variables 27844 1726882743.25542: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f0 27844 1726882743.25545: WORKER PROCESS EXITING 27844 1726882743.25601: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:39:03 -0400 (0:00:00.024) 0:00:02.332 ****** 27844 1726882743.25628: entering _queue_task() for managed_node1/package 27844 1726882743.25630: Creating lock for package 27844 1726882743.26034: worker is 1 (out of 1 available) 27844 1726882743.26044: exiting _queue_task() for managed_node1/package 27844 1726882743.26055: done queuing things up, now waiting for results queue to drain 27844 1726882743.26057: waiting for pending results... 
27844 1726882743.26773: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 27844 1726882743.26893: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000f1 27844 1726882743.26911: variable 'ansible_search_path' from source: unknown 27844 1726882743.26918: variable 'ansible_search_path' from source: unknown 27844 1726882743.26966: calling self._execute() 27844 1726882743.27037: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.27054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.27071: variable 'omit' from source: magic vars 27844 1726882743.27425: variable 'ansible_distribution' from source: facts 27844 1726882743.27442: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27844 1726882743.27576: variable 'ansible_distribution_major_version' from source: facts 27844 1726882743.27590: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27844 1726882743.27600: when evaluation is False, skipping this task 27844 1726882743.27607: _execute() done 27844 1726882743.27614: dumping result to json 27844 1726882743.27621: done dumping result, returning 27844 1726882743.27630: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-efa9-466a-0000000000f1] 27844 1726882743.27639: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f1 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27844 1726882743.27783: no more pending results, returning what we have 27844 1726882743.27787: results queue empty 27844 1726882743.27788: checking for any_errors_fatal 27844 1726882743.27797: done checking for any_errors_fatal 27844 1726882743.27797: checking for max_fail_percentage 27844 1726882743.27799: done checking for max_fail_percentage 27844 1726882743.27800: checking to see if 
all hosts have failed and the running result is not ok 27844 1726882743.27801: done checking to see if all hosts have failed 27844 1726882743.27802: getting the remaining hosts for this loop 27844 1726882743.27803: done getting the remaining hosts for this loop 27844 1726882743.27806: getting the next task for host managed_node1 27844 1726882743.27813: done getting next task for host managed_node1 27844 1726882743.27815: ^ task is: TASK: Enable EPEL 7 27844 1726882743.27819: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.27822: getting variables 27844 1726882743.27824: in VariableManager get_vars() 27844 1726882743.27852: Calling all_inventory to load vars for managed_node1 27844 1726882743.27855: Calling groups_inventory to load vars for managed_node1 27844 1726882743.27858: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.27873: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.27877: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.27880: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.28050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.28250: done with get_vars() 27844 1726882743.28259: done getting variables 27844 1726882743.28328: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:39:03 -0400 (0:00:00.027) 0:00:02.360 ****** 27844 1726882743.28357: entering _queue_task() for managed_node1/command 27844 1726882743.28531: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f1 27844 1726882743.28534: WORKER PROCESS EXITING 27844 1726882743.28818: worker is 1 (out of 1 available) 27844 1726882743.28828: exiting _queue_task() for managed_node1/command 27844 1726882743.28839: done queuing things up, now waiting for results queue to drain 27844 1726882743.28840: waiting for pending results... 
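The "Install yum-utils package" skip recorded above is driven by a two-item `when` list: the log shows `ansible_distribution in ['RedHat', 'CentOS']` evaluating True and `ansible_distribution_major_version in ['7', '8']` evaluating False, after which the task is skipped. A hedged sketch of what that task in `enable_epel.yml` likely looks like — only the task name, the `package` action, and the two conditionals are taken from the log; everything else is an assumption:

```yaml
# Sketch reconstructed from the log output, not the actual enable_epel.yml contents.
- name: Install yum-utils package
  package:
    name: yum-utils        # package name inferred from the task name
    state: present
  when:
    # Ansible ANDs the items of a `when` list; all must be truthy.
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

On this host the distribution check passed but the major-version check did not, which is why the skip result reports `"false_condition": "ansible_distribution_major_version in ['7', '8']"` — Ansible records the first condition that evaluated False.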
27844 1726882743.29060: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 27844 1726882743.29162: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000f2 27844 1726882743.29187: variable 'ansible_search_path' from source: unknown 27844 1726882743.29194: variable 'ansible_search_path' from source: unknown 27844 1726882743.29237: calling self._execute() 27844 1726882743.29314: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.29325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.29337: variable 'omit' from source: magic vars 27844 1726882743.30073: variable 'ansible_distribution' from source: facts 27844 1726882743.30091: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27844 1726882743.30319: variable 'ansible_distribution_major_version' from source: facts 27844 1726882743.30331: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27844 1726882743.30339: when evaluation is False, skipping this task 27844 1726882743.30346: _execute() done 27844 1726882743.30353: dumping result to json 27844 1726882743.30365: done dumping result, returning 27844 1726882743.30379: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-efa9-466a-0000000000f2] 27844 1726882743.30388: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27844 1726882743.30530: no more pending results, returning what we have 27844 1726882743.30534: results queue empty 27844 1726882743.30535: checking for any_errors_fatal 27844 1726882743.30541: done checking for any_errors_fatal 27844 1726882743.30542: checking for max_fail_percentage 27844 1726882743.30544: done checking for max_fail_percentage 27844 1726882743.30545: checking to see if all hosts have failed and 
the running result is not ok 27844 1726882743.30546: done checking to see if all hosts have failed 27844 1726882743.30546: getting the remaining hosts for this loop 27844 1726882743.30548: done getting the remaining hosts for this loop 27844 1726882743.30551: getting the next task for host managed_node1 27844 1726882743.30558: done getting next task for host managed_node1 27844 1726882743.30560: ^ task is: TASK: Enable EPEL 8 27844 1726882743.30566: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.30571: getting variables 27844 1726882743.30573: in VariableManager get_vars() 27844 1726882743.30602: Calling all_inventory to load vars for managed_node1 27844 1726882743.30606: Calling groups_inventory to load vars for managed_node1 27844 1726882743.30610: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.30623: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.30627: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.30630: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.30842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.31037: done with get_vars() 27844 1726882743.31046: done getting variables 27844 1726882743.31119: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:39:03 -0400 (0:00:00.027) 0:00:02.388 ****** 27844 1726882743.31151: entering _queue_task() for managed_node1/command 27844 1726882743.31172: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f2 27844 1726882743.31183: WORKER PROCESS EXITING 27844 1726882743.31551: worker is 1 (out of 1 available) 27844 1726882743.31561: exiting _queue_task() for managed_node1/command 27844 1726882743.31573: done queuing things up, now waiting for results queue to drain 27844 1726882743.31575: waiting for pending results... 
27844 1726882743.31797: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 27844 1726882743.31904: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000f3 27844 1726882743.31927: variable 'ansible_search_path' from source: unknown 27844 1726882743.31935: variable 'ansible_search_path' from source: unknown 27844 1726882743.31976: calling self._execute() 27844 1726882743.32052: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.32065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.32082: variable 'omit' from source: magic vars 27844 1726882743.32462: variable 'ansible_distribution' from source: facts 27844 1726882743.32480: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27844 1726882743.32603: variable 'ansible_distribution_major_version' from source: facts 27844 1726882743.32629: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27844 1726882743.32637: when evaluation is False, skipping this task 27844 1726882743.32642: _execute() done 27844 1726882743.32648: dumping result to json 27844 1726882743.32654: done dumping result, returning 27844 1726882743.32662: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-efa9-466a-0000000000f3] 27844 1726882743.32674: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f3 27844 1726882743.32761: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f3 27844 1726882743.32771: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27844 1726882743.32816: no more pending results, returning what we have 27844 1726882743.32820: results queue empty 27844 1726882743.32821: checking for any_errors_fatal 27844 1726882743.32826: done checking for any_errors_fatal 27844 1726882743.32827: checking for 
max_fail_percentage 27844 1726882743.32828: done checking for max_fail_percentage 27844 1726882743.32829: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.32830: done checking to see if all hosts have failed 27844 1726882743.32831: getting the remaining hosts for this loop 27844 1726882743.32832: done getting the remaining hosts for this loop 27844 1726882743.32835: getting the next task for host managed_node1 27844 1726882743.32842: done getting next task for host managed_node1 27844 1726882743.32844: ^ task is: TASK: Enable EPEL 6 27844 1726882743.32848: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.32851: getting variables 27844 1726882743.32853: in VariableManager get_vars() 27844 1726882743.32881: Calling all_inventory to load vars for managed_node1 27844 1726882743.32883: Calling groups_inventory to load vars for managed_node1 27844 1726882743.32887: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.32899: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.32902: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.32906: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.33088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.33296: done with get_vars() 27844 1726882743.33304: done getting variables 27844 1726882743.33365: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:39:03 -0400 (0:00:00.022) 0:00:02.410 ****** 27844 1726882743.33395: entering _queue_task() for managed_node1/copy 27844 1726882743.33726: worker is 1 (out of 1 available) 27844 1726882743.33737: exiting _queue_task() for managed_node1/copy 27844 1726882743.33747: done queuing things up, now waiting for results queue to drain 27844 1726882743.33749: waiting for pending results... 
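The "Enable EPEL 6" task queued here differs from the preceding ones in two ways visible in the log: it loads the `copy` action plugin rather than `command`, and its guard is an equality check (`ansible_distribution_major_version == '6'`) rather than a membership test. A minimal sketch under those assumptions — the file being copied is not recoverable from the log and is deliberately left out:

```yaml
# Hypothetical reconstruction; only the task name, the `copy` action,
# and the conditionals appear in the log. src/dest/content are unknown.
- name: Enable EPEL 6
  copy:
    # destination and payload omitted -- not recoverable from this log
    dest: /path/not/shown/in/log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```

Because the managed node is neither EL 6, 7, nor 8, every task in this conditional block is skipped, and the play falls through to "Set network provider to 'nm'".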
27844 1726882743.33959: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 27844 1726882743.34061: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000f5 27844 1726882743.34085: variable 'ansible_search_path' from source: unknown 27844 1726882743.34093: variable 'ansible_search_path' from source: unknown 27844 1726882743.34130: calling self._execute() 27844 1726882743.34202: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.34212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.34225: variable 'omit' from source: magic vars 27844 1726882743.35226: variable 'ansible_distribution' from source: facts 27844 1726882743.35247: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27844 1726882743.35650: variable 'ansible_distribution_major_version' from source: facts 27844 1726882743.35661: Evaluated conditional (ansible_distribution_major_version == '6'): False 27844 1726882743.35680: when evaluation is False, skipping this task 27844 1726882743.35702: _execute() done 27844 1726882743.35710: dumping result to json 27844 1726882743.35717: done dumping result, returning 27844 1726882743.35727: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-efa9-466a-0000000000f5] 27844 1726882743.35735: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 27844 1726882743.35925: no more pending results, returning what we have 27844 1726882743.35929: results queue empty 27844 1726882743.35929: checking for any_errors_fatal 27844 1726882743.35933: done checking for any_errors_fatal 27844 1726882743.35933: checking for max_fail_percentage 27844 1726882743.35935: done checking for max_fail_percentage 27844 1726882743.35936: checking to see if all hosts have failed and the running 
result is not ok 27844 1726882743.35937: done checking to see if all hosts have failed 27844 1726882743.35938: getting the remaining hosts for this loop 27844 1726882743.35939: done getting the remaining hosts for this loop 27844 1726882743.35943: getting the next task for host managed_node1 27844 1726882743.35951: done getting next task for host managed_node1 27844 1726882743.35953: ^ task is: TASK: Set network provider to 'nm' 27844 1726882743.35955: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882743.35959: getting variables 27844 1726882743.35961: in VariableManager get_vars() 27844 1726882743.35992: Calling all_inventory to load vars for managed_node1 27844 1726882743.35995: Calling groups_inventory to load vars for managed_node1 27844 1726882743.35998: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.36010: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.36014: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.36017: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.36235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.36496: done with get_vars() 27844 1726882743.36505: done getting variables 27844 1726882743.36578: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:13 Friday 20 September 2024 21:39:03 -0400 (0:00:00.032) 0:00:02.442 ****** 27844 1726882743.36612: entering _queue_task() for managed_node1/set_fact 27844 1726882743.36627: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000f5 27844 1726882743.36633: WORKER PROCESS EXITING 27844 1726882743.37027: worker is 1 (out of 1 available) 27844 1726882743.37039: exiting _queue_task() for managed_node1/set_fact 27844 1726882743.37049: done queuing things up, now waiting for results queue to drain 27844 1726882743.37050: waiting for pending results... 27844 1726882743.37278: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 27844 1726882743.37372: in run() - task 0e448fcc-3ce9-efa9-466a-000000000007 27844 1726882743.37397: variable 'ansible_search_path' from source: unknown 27844 1726882743.37436: calling self._execute() 27844 1726882743.37512: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.37524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.37538: variable 'omit' from source: magic vars 27844 1726882743.37643: variable 'omit' from source: magic vars 27844 1726882743.37683: variable 'omit' from source: magic vars 27844 1726882743.37728: variable 'omit' from source: magic vars 27844 1726882743.37777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882743.37817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882743.37847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882743.37874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882743.37892: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882743.37924: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882743.37936: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.37944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.38049: Set connection var ansible_shell_type to sh 27844 1726882743.38057: Set connection var ansible_connection to ssh 27844 1726882743.38087: Set connection var ansible_pipelining to False 27844 1726882743.38099: Set connection var ansible_timeout to 10 27844 1726882743.38109: Set connection var ansible_shell_executable to /bin/sh 27844 1726882743.38118: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882743.38172: variable 'ansible_shell_executable' from source: unknown 27844 1726882743.38182: variable 'ansible_connection' from source: unknown 27844 1726882743.38189: variable 'ansible_module_compression' from source: unknown 27844 1726882743.38196: variable 'ansible_shell_type' from source: unknown 27844 1726882743.38202: variable 'ansible_shell_executable' from source: unknown 27844 1726882743.38208: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.38238: variable 'ansible_pipelining' from source: unknown 27844 1726882743.38246: variable 'ansible_timeout' from source: unknown 27844 1726882743.38254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.38401: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882743.38428: variable 'omit' from source: magic vars 27844 1726882743.38438: starting 
attempt loop 27844 1726882743.38445: running the handler 27844 1726882743.38460: handler run complete 27844 1726882743.38482: attempt loop complete, returning result 27844 1726882743.38489: _execute() done 27844 1726882743.38496: dumping result to json 27844 1726882743.38503: done dumping result, returning 27844 1726882743.38515: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0e448fcc-3ce9-efa9-466a-000000000007] 27844 1726882743.38524: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 27844 1726882743.38667: no more pending results, returning what we have 27844 1726882743.38670: results queue empty 27844 1726882743.38671: checking for any_errors_fatal 27844 1726882743.38679: done checking for any_errors_fatal 27844 1726882743.38680: checking for max_fail_percentage 27844 1726882743.38682: done checking for max_fail_percentage 27844 1726882743.38683: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.38684: done checking to see if all hosts have failed 27844 1726882743.38684: getting the remaining hosts for this loop 27844 1726882743.38686: done getting the remaining hosts for this loop 27844 1726882743.38690: getting the next task for host managed_node1 27844 1726882743.38697: done getting next task for host managed_node1 27844 1726882743.38700: ^ task is: TASK: meta (flush_handlers) 27844 1726882743.38702: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.38706: getting variables 27844 1726882743.38708: in VariableManager get_vars() 27844 1726882743.38738: Calling all_inventory to load vars for managed_node1 27844 1726882743.38741: Calling groups_inventory to load vars for managed_node1 27844 1726882743.38745: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.38756: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.38759: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.38763: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.38939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.39129: done with get_vars() 27844 1726882743.39139: done getting variables 27844 1726882743.39205: in VariableManager get_vars() 27844 1726882743.39214: Calling all_inventory to load vars for managed_node1 27844 1726882743.39216: Calling groups_inventory to load vars for managed_node1 27844 1726882743.39219: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.39223: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.39225: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.39228: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.39575: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000007 27844 1726882743.39578: WORKER PROCESS EXITING 27844 1726882743.39597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.39777: done with get_vars() 27844 1726882743.39790: done queuing things up, now waiting for results queue to drain 27844 1726882743.39792: results queue empty 27844 1726882743.39793: checking for any_errors_fatal 27844 1726882743.39795: done checking for any_errors_fatal 27844 1726882743.39795: checking for max_fail_percentage 27844 
1726882743.39796: done checking for max_fail_percentage 27844 1726882743.39797: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.39798: done checking to see if all hosts have failed 27844 1726882743.39798: getting the remaining hosts for this loop 27844 1726882743.39799: done getting the remaining hosts for this loop 27844 1726882743.39802: getting the next task for host managed_node1 27844 1726882743.39805: done getting next task for host managed_node1 27844 1726882743.39807: ^ task is: TASK: meta (flush_handlers) 27844 1726882743.39808: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882743.39816: getting variables 27844 1726882743.39817: in VariableManager get_vars() 27844 1726882743.39824: Calling all_inventory to load vars for managed_node1 27844 1726882743.39827: Calling groups_inventory to load vars for managed_node1 27844 1726882743.39829: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.39833: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.39835: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.39838: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.39969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.40325: done with get_vars() 27844 1726882743.40331: done getting variables 27844 1726882743.40372: in VariableManager get_vars() 27844 1726882743.40380: Calling all_inventory to load vars for managed_node1 27844 1726882743.40382: Calling groups_inventory to load vars for managed_node1 27844 1726882743.40384: Calling all_plugins_inventory to load vars for managed_node1 27844 
1726882743.40388: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.40390: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.40393: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.40527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.40743: done with get_vars() 27844 1726882743.40752: done queuing things up, now waiting for results queue to drain 27844 1726882743.40754: results queue empty 27844 1726882743.40755: checking for any_errors_fatal 27844 1726882743.40756: done checking for any_errors_fatal 27844 1726882743.40757: checking for max_fail_percentage 27844 1726882743.40757: done checking for max_fail_percentage 27844 1726882743.40758: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.40759: done checking to see if all hosts have failed 27844 1726882743.40760: getting the remaining hosts for this loop 27844 1726882743.40760: done getting the remaining hosts for this loop 27844 1726882743.40763: getting the next task for host managed_node1 27844 1726882743.40767: done getting next task for host managed_node1 27844 1726882743.40768: ^ task is: None 27844 1726882743.40770: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.40771: done queuing things up, now waiting for results queue to drain 27844 1726882743.40772: results queue empty 27844 1726882743.40772: checking for any_errors_fatal 27844 1726882743.40773: done checking for any_errors_fatal 27844 1726882743.40774: checking for max_fail_percentage 27844 1726882743.40775: done checking for max_fail_percentage 27844 1726882743.40775: checking to see if all hosts have failed and the running result is not ok 27844 1726882743.40776: done checking to see if all hosts have failed 27844 1726882743.40778: getting the next task for host managed_node1 27844 1726882743.40780: done getting next task for host managed_node1 27844 1726882743.40780: ^ task is: None 27844 1726882743.40782: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.41358: in VariableManager get_vars() 27844 1726882743.41385: done with get_vars() 27844 1726882743.41391: in VariableManager get_vars() 27844 1726882743.41405: done with get_vars() 27844 1726882743.41410: variable 'omit' from source: magic vars 27844 1726882743.41440: in VariableManager get_vars() 27844 1726882743.41455: done with get_vars() 27844 1726882743.41481: variable 'omit' from source: magic vars PLAY [Test output device of routes] ******************************************** 27844 1726882743.41918: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 27844 1726882743.42151: getting the remaining hosts for this loop 27844 1726882743.42153: done getting the remaining hosts for this loop 27844 1726882743.42155: getting the next task for host managed_node1 27844 1726882743.42158: done getting next task for host managed_node1 27844 1726882743.42160: ^ task is: TASK: Gathering Facts 27844 1726882743.42161: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882743.42166: getting variables 27844 1726882743.42167: in VariableManager get_vars() 27844 1726882743.42179: Calling all_inventory to load vars for managed_node1 27844 1726882743.42181: Calling groups_inventory to load vars for managed_node1 27844 1726882743.42183: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882743.42187: Calling all_plugins_play to load vars for managed_node1 27844 1726882743.42201: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882743.42204: Calling groups_plugins_play to load vars for managed_node1 27844 1726882743.42377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882743.42573: done with get_vars() 27844 1726882743.42581: done getting variables 27844 1726882743.42616: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Friday 20 September 2024 21:39:03 -0400 (0:00:00.060) 0:00:02.502 ****** 27844 1726882743.42637: entering _queue_task() for managed_node1/gather_facts 27844 1726882743.43045: worker is 1 (out of 1 available) 27844 1726882743.43055: exiting _queue_task() for managed_node1/gather_facts 27844 1726882743.43067: done queuing things up, now waiting for results queue to drain 27844 1726882743.43069: waiting for pending results... 
27844 1726882743.43303: running TaskExecutor() for managed_node1/TASK: Gathering Facts 27844 1726882743.43399: in run() - task 0e448fcc-3ce9-efa9-466a-00000000011b 27844 1726882743.43419: variable 'ansible_search_path' from source: unknown 27844 1726882743.43456: calling self._execute() 27844 1726882743.43534: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.43547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.43560: variable 'omit' from source: magic vars 27844 1726882743.43939: variable 'ansible_distribution_major_version' from source: facts 27844 1726882743.43961: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882743.43975: variable 'omit' from source: magic vars 27844 1726882743.44001: variable 'omit' from source: magic vars 27844 1726882743.44039: variable 'omit' from source: magic vars 27844 1726882743.44090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882743.44127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882743.44151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882743.44179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882743.44196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882743.44228: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882743.44238: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.44245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.44350: Set connection var ansible_shell_type to sh 27844 1726882743.44358: Set connection 
var ansible_connection to ssh 27844 1726882743.44372: Set connection var ansible_pipelining to False 27844 1726882743.44387: Set connection var ansible_timeout to 10 27844 1726882743.44397: Set connection var ansible_shell_executable to /bin/sh 27844 1726882743.44407: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882743.44436: variable 'ansible_shell_executable' from source: unknown 27844 1726882743.44445: variable 'ansible_connection' from source: unknown 27844 1726882743.44453: variable 'ansible_module_compression' from source: unknown 27844 1726882743.44459: variable 'ansible_shell_type' from source: unknown 27844 1726882743.44469: variable 'ansible_shell_executable' from source: unknown 27844 1726882743.44476: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882743.44487: variable 'ansible_pipelining' from source: unknown 27844 1726882743.44494: variable 'ansible_timeout' from source: unknown 27844 1726882743.44501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882743.44675: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882743.44692: variable 'omit' from source: magic vars 27844 1726882743.44705: starting attempt loop 27844 1726882743.44712: running the handler 27844 1726882743.44730: variable 'ansible_facts' from source: unknown 27844 1726882743.44754: _low_level_execute_command(): starting 27844 1726882743.44769: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882743.45509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882743.45525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 
1726882743.45542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.45564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.45609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882743.45623: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882743.45638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.45658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882743.45674: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882743.45690: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882743.45703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.45717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.45734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.45748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882743.45761: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882743.45779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.45857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882743.45883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882743.45901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882743.46030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 
1726882743.47690: stdout chunk (state=3): >>>/root <<< 27844 1726882743.47797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882743.47874: stderr chunk (state=3): >>><<< 27844 1726882743.47888: stdout chunk (state=3): >>><<< 27844 1726882743.47971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882743.47974: _low_level_execute_command(): starting 27844 1726882743.47978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460 `" && echo ansible-tmp-1726882743.4791543-28014-249434955489460="` echo /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460 `" ) && sleep 0' 27844 1726882743.48608: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 27844 1726882743.48623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.48637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.48656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.48700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882743.48713: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882743.48728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.48746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882743.48759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882743.48774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882743.48787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.48801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.48816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.48835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882743.48848: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882743.48862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.48936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882743.48959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882743.48979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 27844 1726882743.49103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882743.50951: stdout chunk (state=3): >>>ansible-tmp-1726882743.4791543-28014-249434955489460=/root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460 <<< 27844 1726882743.51176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882743.51179: stdout chunk (state=3): >>><<< 27844 1726882743.51182: stderr chunk (state=3): >>><<< 27844 1726882743.51471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882743.4791543-28014-249434955489460=/root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882743.51475: variable 'ansible_module_compression' from source: unknown 27844 1726882743.51478: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 27844 1726882743.51480: variable 'ansible_facts' from source: unknown 27844 1726882743.51523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460/AnsiballZ_setup.py 27844 1726882743.51691: Sending initial data 27844 1726882743.51694: Sent initial data (154 bytes) 27844 1726882743.52689: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882743.52707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.52722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.52740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.52788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882743.52808: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882743.52824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.52842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882743.52854: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882743.52871: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882743.52885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.52902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.52925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.52938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 
1726882743.52949: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882743.52962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.53047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882743.53068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882743.53084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882743.53208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882743.54930: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882743.55020: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882743.55124: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpoisiq9hb /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460/AnsiballZ_setup.py <<< 27844 1726882743.55219: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882743.58109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882743.58339: stderr chunk (state=3): >>><<< 27844 1726882743.58343: stdout chunk (state=3): >>><<< 27844 1726882743.58345: done transferring 
module to remote 27844 1726882743.58350: _low_level_execute_command(): starting 27844 1726882743.58353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460/ /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460/AnsiballZ_setup.py && sleep 0' 27844 1726882743.59077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.59081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.59118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882743.59123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.59126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.59185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882743.59193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882743.59293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882743.61023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882743.61089: 
stderr chunk (state=3): >>><<< 27844 1726882743.61092: stdout chunk (state=3): >>><<< 27844 1726882743.61178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882743.61182: _low_level_execute_command(): starting 27844 1726882743.61185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460/AnsiballZ_setup.py && sleep 0' 27844 1726882743.61736: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882743.61752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.61774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.61793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 27844 1726882743.61834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882743.61851: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882743.61898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.61917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882743.61931: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882743.61943: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882743.61959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882743.62003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882743.62021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882743.62037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882743.62050: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882743.62102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882743.62199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882743.62225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882743.62242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882743.62399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882744.13282: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": 
"x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHV<<< 27844 1726882744.13342: stdout chunk (state=3): 
>>>IxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "03", "epoch": "1726882743", "epoch_int": "1726882743", "date": "2024-09-20", "time": "21:39:03", "iso8601_micro": "2024-09-21T01:39:03.865668Z", "iso8601": "2024-09-21T01:39:03Z", "iso8601_basic": "20240920T213903865668", "iso8601_basic_short": "20240920T213903", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.73, "5m": 0.51, "15m": 0.29}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, 
"free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 901, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238891008, "block_size": 4096, "block_total": 65519355, "block_available": 64511448, "block_used": 1007907, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], 
"ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp<<< 27844 1726882744.13353: stdout chunk (state=3): >>>_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 27844 1726882744.15023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882744.15035: stdout chunk (state=3): >>><<< 27844 1726882744.15039: stderr chunk (state=3): >>><<< 27844 1726882744.15384: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": 
"CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": 
"2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "03", "epoch": "1726882743", "epoch_int": "1726882743", "date": "2024-09-20", "time": "21:39:03", "iso8601_micro": "2024-09-21T01:39:03.865668Z", "iso8601": "2024-09-21T01:39:03Z", "iso8601_basic": "20240920T213903865668", "iso8601_basic_short": "20240920T213903", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.73, "5m": 0.51, "15m": 0.29}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, 
"vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 901, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238891008, "block_size": 4096, "block_total": 65519355, "block_available": 64511448, "block_used": 1007907, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": 
"10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882744.15439: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882744.15467: _low_level_execute_command(): starting 27844 1726882744.15478: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882743.4791543-28014-249434955489460/ > /dev/null 2>&1 && sleep 0' 27844 1726882744.18097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882744.18114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.18175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.18194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.18240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.18282: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882744.18297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.18315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882744.18327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 27844 1726882744.18391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882744.18405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.18420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.18504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.18518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.18529: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882744.18543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.18736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882744.18836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882744.18855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882744.18985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882744.20901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882744.20904: stdout chunk (state=3): >>><<< 27844 1726882744.20907: stderr chunk (state=3): >>><<< 27844 1726882744.21172: _low_level_execute_command() done: rc=0, stdout=, stderr= 27844 1726882744.21176: handler run complete 27844 1726882744.21178: variable 'ansible_facts' from source: unknown 27844 1726882744.21180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.21516: variable 'ansible_facts' from source: unknown 27844 1726882744.21717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.21938: attempt loop complete, returning result 27844 1726882744.21947: _execute() done 27844 1726882744.21954: dumping result to json 27844 1726882744.21990: done dumping result, returning 27844 1726882744.22004: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-efa9-466a-00000000011b] 27844 1726882744.22013: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000011b ok: [managed_node1] 27844 1726882744.22862: no more pending results, returning what we have 27844 1726882744.22867: results queue empty 27844 1726882744.22868: checking for any_errors_fatal 27844 1726882744.22870: done checking for any_errors_fatal 27844 1726882744.22870: checking for max_fail_percentage 27844 1726882744.22872: done checking for max_fail_percentage 27844 1726882744.22873: checking to see if all hosts have failed and the running result is not ok 27844 1726882744.22874: done 
checking to see if all hosts have failed 27844 1726882744.22875: getting the remaining hosts for this loop 27844 1726882744.22877: done getting the remaining hosts for this loop 27844 1726882744.22880: getting the next task for host managed_node1 27844 1726882744.22888: done getting next task for host managed_node1 27844 1726882744.22890: ^ task is: TASK: meta (flush_handlers) 27844 1726882744.22893: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882744.22897: getting variables 27844 1726882744.22899: in VariableManager get_vars() 27844 1726882744.22943: Calling all_inventory to load vars for managed_node1 27844 1726882744.22946: Calling groups_inventory to load vars for managed_node1 27844 1726882744.22949: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.22962: Calling all_plugins_play to load vars for managed_node1 27844 1726882744.22967: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.22970: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.23128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.23637: done with get_vars() 27844 1726882744.23647: done getting variables 27844 1726882744.24371: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000011b 27844 1726882744.24375: WORKER PROCESS EXITING 27844 1726882744.24418: in VariableManager get_vars() 27844 1726882744.24433: Calling all_inventory to load vars for managed_node1 27844 1726882744.24435: Calling groups_inventory to load vars for managed_node1 27844 1726882744.24437: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.24441: Calling 
all_plugins_play to load vars for managed_node1 27844 1726882744.24443: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.24450: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.24593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.24815: done with get_vars() 27844 1726882744.24828: done queuing things up, now waiting for results queue to drain 27844 1726882744.24830: results queue empty 27844 1726882744.24830: checking for any_errors_fatal 27844 1726882744.24834: done checking for any_errors_fatal 27844 1726882744.24835: checking for max_fail_percentage 27844 1726882744.24836: done checking for max_fail_percentage 27844 1726882744.24836: checking to see if all hosts have failed and the running result is not ok 27844 1726882744.24837: done checking to see if all hosts have failed 27844 1726882744.24838: getting the remaining hosts for this loop 27844 1726882744.24839: done getting the remaining hosts for this loop 27844 1726882744.24842: getting the next task for host managed_node1 27844 1726882744.24845: done getting next task for host managed_node1 27844 1726882744.24847: ^ task is: TASK: Set type and interface0 27844 1726882744.24849: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882744.24851: getting variables 27844 1726882744.24852: in VariableManager get_vars() 27844 1726882744.24869: Calling all_inventory to load vars for managed_node1 27844 1726882744.24871: Calling groups_inventory to load vars for managed_node1 27844 1726882744.24873: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.24878: Calling all_plugins_play to load vars for managed_node1 27844 1726882744.24880: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.24883: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.25021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.25206: done with get_vars() 27844 1726882744.25214: done getting variables 27844 1726882744.25253: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set type and interface0] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:11 Friday 20 September 2024 21:39:04 -0400 (0:00:00.833) 0:00:03.336 ****** 27844 1726882744.25984: entering _queue_task() for managed_node1/set_fact 27844 1726882744.26276: worker is 1 (out of 1 available) 27844 1726882744.26288: exiting _queue_task() for managed_node1/set_fact 27844 1726882744.26300: done queuing things up, now waiting for results queue to drain 27844 1726882744.26302: waiting for pending results... 
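[Annotation] The worker is about to resolve the play-level variables `type` and `interface0` that the queued set_fact task consumes. The playbook source is not shown in this log, but given the values that appear in the task result further down, the play header presumably looks something like this (a hypothetical reconstruction, not the actual file):

```yaml
# Hypothetical sketch of the play vars in tests_route_device.yml;
# variable names and values are taken from this log, layout is assumed.
- hosts: all
  vars:
    type: veth
    interface0: ethtest0
```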
27844 1726882744.28027: running TaskExecutor() for managed_node1/TASK: Set type and interface0 27844 1726882744.28099: in run() - task 0e448fcc-3ce9-efa9-466a-00000000000b 27844 1726882744.28120: variable 'ansible_search_path' from source: unknown 27844 1726882744.28162: calling self._execute() 27844 1726882744.28242: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.28246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882744.28255: variable 'omit' from source: magic vars 27844 1726882744.29178: variable 'ansible_distribution_major_version' from source: facts 27844 1726882744.29282: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882744.29294: variable 'omit' from source: magic vars 27844 1726882744.29366: variable 'omit' from source: magic vars 27844 1726882744.29400: variable 'type' from source: play vars 27844 1726882744.29676: variable 'type' from source: play vars 27844 1726882744.29703: variable 'interface0' from source: play vars 27844 1726882744.29893: variable 'interface0' from source: play vars 27844 1726882744.29916: variable 'omit' from source: magic vars 27844 1726882744.29959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882744.30117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882744.30143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882744.30169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882744.30187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882744.30227: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882744.30325: 
variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.30334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882744.30527: Set connection var ansible_shell_type to sh 27844 1726882744.30541: Set connection var ansible_connection to ssh 27844 1726882744.30552: Set connection var ansible_pipelining to False 27844 1726882744.30562: Set connection var ansible_timeout to 10 27844 1726882744.30579: Set connection var ansible_shell_executable to /bin/sh 27844 1726882744.30591: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882744.30679: variable 'ansible_shell_executable' from source: unknown 27844 1726882744.30688: variable 'ansible_connection' from source: unknown 27844 1726882744.30696: variable 'ansible_module_compression' from source: unknown 27844 1726882744.30703: variable 'ansible_shell_type' from source: unknown 27844 1726882744.30760: variable 'ansible_shell_executable' from source: unknown 27844 1726882744.30774: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.30784: variable 'ansible_pipelining' from source: unknown 27844 1726882744.30792: variable 'ansible_timeout' from source: unknown 27844 1726882744.30800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882744.31043: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882744.31094: variable 'omit' from source: magic vars 27844 1726882744.31105: starting attempt loop 27844 1726882744.31108: running the handler 27844 1726882744.31117: handler run complete 27844 1726882744.31183: attempt loop complete, returning result 27844 1726882744.31193: _execute() done 27844 1726882744.31198: 
dumping result to json 27844 1726882744.31204: done dumping result, returning 27844 1726882744.31214: done running TaskExecutor() for managed_node1/TASK: Set type and interface0 [0e448fcc-3ce9-efa9-466a-00000000000b] 27844 1726882744.31222: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000b ok: [managed_node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 27844 1726882744.31377: no more pending results, returning what we have 27844 1726882744.31381: results queue empty 27844 1726882744.31382: checking for any_errors_fatal 27844 1726882744.31384: done checking for any_errors_fatal 27844 1726882744.31385: checking for max_fail_percentage 27844 1726882744.31386: done checking for max_fail_percentage 27844 1726882744.31387: checking to see if all hosts have failed and the running result is not ok 27844 1726882744.31388: done checking to see if all hosts have failed 27844 1726882744.31389: getting the remaining hosts for this loop 27844 1726882744.31391: done getting the remaining hosts for this loop 27844 1726882744.31394: getting the next task for host managed_node1 27844 1726882744.31400: done getting next task for host managed_node1 27844 1726882744.31403: ^ task is: TASK: Show interfaces 27844 1726882744.31405: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882744.31408: getting variables 27844 1726882744.31410: in VariableManager get_vars() 27844 1726882744.31454: Calling all_inventory to load vars for managed_node1 27844 1726882744.31457: Calling groups_inventory to load vars for managed_node1 27844 1726882744.31459: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.31474: Calling all_plugins_play to load vars for managed_node1 27844 1726882744.31477: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.31480: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.31647: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000b 27844 1726882744.31651: WORKER PROCESS EXITING 27844 1726882744.31666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.31892: done with get_vars() 27844 1726882744.31902: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:15 Friday 20 September 2024 21:39:04 -0400 (0:00:00.061) 0:00:03.397 ****** 27844 1726882744.32104: entering _queue_task() for managed_node1/include_tasks 27844 1726882744.32569: worker is 1 (out of 1 available) 27844 1726882744.32580: exiting _queue_task() for managed_node1/include_tasks 27844 1726882744.32592: done queuing things up, now waiting for results queue to drain 27844 1726882744.32593: waiting for pending results... 
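[Annotation] The result above (`ok: [managed_node1] => {"ansible_facts": {"interface": "ethtest0", "type": "veth"}}`) and the evaluated conditional suggest the task at tests_route_device.yml:11 is roughly the following. This is a hedged sketch inferred from the log, not the file's verbatim content:

```yaml
# Hedged reconstruction of TASK [Set type and interface0];
# the `when` clause matches the conditional evaluated in the log.
- name: Set type and interface0
  set_fact:
    type: "{{ type }}"
    interface: "{{ interface0 }}"
  when: ansible_distribution_major_version != '6'
```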
27844 1726882744.33448: running TaskExecutor() for managed_node1/TASK: Show interfaces 27844 1726882744.33718: in run() - task 0e448fcc-3ce9-efa9-466a-00000000000c 27844 1726882744.33737: variable 'ansible_search_path' from source: unknown 27844 1726882744.33822: calling self._execute() 27844 1726882744.33910: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.34041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882744.34073: variable 'omit' from source: magic vars 27844 1726882744.35042: variable 'ansible_distribution_major_version' from source: facts 27844 1726882744.35068: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882744.35132: _execute() done 27844 1726882744.35139: dumping result to json 27844 1726882744.35147: done dumping result, returning 27844 1726882744.35161: done running TaskExecutor() for managed_node1/TASK: Show interfaces [0e448fcc-3ce9-efa9-466a-00000000000c] 27844 1726882744.35175: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000c 27844 1726882744.35345: no more pending results, returning what we have 27844 1726882744.35351: in VariableManager get_vars() 27844 1726882744.35400: Calling all_inventory to load vars for managed_node1 27844 1726882744.35402: Calling groups_inventory to load vars for managed_node1 27844 1726882744.35404: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.35420: Calling all_plugins_play to load vars for managed_node1 27844 1726882744.35423: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.35427: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.35591: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000c 27844 1726882744.35594: WORKER PROCESS EXITING 27844 1726882744.35617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 
1726882744.35809: done with get_vars() 27844 1726882744.35817: variable 'ansible_search_path' from source: unknown 27844 1726882744.35830: we have included files to process 27844 1726882744.35831: generating all_blocks data 27844 1726882744.35832: done generating all_blocks data 27844 1726882744.35833: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882744.35834: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882744.35837: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882744.36003: in VariableManager get_vars() 27844 1726882744.36026: done with get_vars() 27844 1726882744.36137: done processing included file 27844 1726882744.36139: iterating over new_blocks loaded from include file 27844 1726882744.36141: in VariableManager get_vars() 27844 1726882744.36158: done with get_vars() 27844 1726882744.36160: filtering new block on tags 27844 1726882744.36178: done filtering new block on tags 27844 1726882744.36180: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 27844 1726882744.36185: extending task lists for all hosts with included blocks 27844 1726882744.36893: done extending task lists 27844 1726882744.36894: done processing included files 27844 1726882744.36895: results queue empty 27844 1726882744.36896: checking for any_errors_fatal 27844 1726882744.36900: done checking for any_errors_fatal 27844 1726882744.36900: checking for max_fail_percentage 27844 1726882744.36902: done checking for max_fail_percentage 27844 1726882744.36902: checking to see if all hosts have failed and the running result is not ok 27844 
1726882744.36903: done checking to see if all hosts have failed 27844 1726882744.36904: getting the remaining hosts for this loop 27844 1726882744.36905: done getting the remaining hosts for this loop 27844 1726882744.36908: getting the next task for host managed_node1 27844 1726882744.36911: done getting next task for host managed_node1 27844 1726882744.36914: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27844 1726882744.36916: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882744.36918: getting variables 27844 1726882744.36920: in VariableManager get_vars() 27844 1726882744.36932: Calling all_inventory to load vars for managed_node1 27844 1726882744.36934: Calling groups_inventory to load vars for managed_node1 27844 1726882744.36936: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.36941: Calling all_plugins_play to load vars for managed_node1 27844 1726882744.37277: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.37282: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.37845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.38042: done with get_vars() 27844 1726882744.38051: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:04 -0400 (0:00:00.060) 0:00:03.457 ****** 27844 1726882744.38122: entering _queue_task() for managed_node1/include_tasks 27844 1726882744.38479: worker is 1 (out of 1 available) 27844 1726882744.38604: exiting _queue_task() for managed_node1/include_tasks 27844 1726882744.38616: done queuing things up, now waiting for results queue to drain 27844 1726882744.38618: waiting for pending results... 
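[Annotation] The included file show_interfaces.yml is processed as a new block and immediately queues an include of get_current_interfaces.yml (show_interfaces.yml:3). A minimal sketch consistent with the log's task names and paths (assumed, not verified against the collection source):

```yaml
# Hedged sketch of show_interfaces.yml as implied by this log.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: tasks/get_current_interfaces.yml
```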
27844 1726882744.39598: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 27844 1726882744.39697: in run() - task 0e448fcc-3ce9-efa9-466a-000000000135 27844 1726882744.39787: variable 'ansible_search_path' from source: unknown 27844 1726882744.39796: variable 'ansible_search_path' from source: unknown 27844 1726882744.39837: calling self._execute() 27844 1726882744.40047: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.40058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882744.40110: variable 'omit' from source: magic vars 27844 1726882744.40844: variable 'ansible_distribution_major_version' from source: facts 27844 1726882744.40983: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882744.40994: _execute() done 27844 1726882744.41000: dumping result to json 27844 1726882744.41007: done dumping result, returning 27844 1726882744.41015: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-efa9-466a-000000000135] 27844 1726882744.41023: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000135 27844 1726882744.41134: no more pending results, returning what we have 27844 1726882744.41139: in VariableManager get_vars() 27844 1726882744.41184: Calling all_inventory to load vars for managed_node1 27844 1726882744.41187: Calling groups_inventory to load vars for managed_node1 27844 1726882744.41190: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.41204: Calling all_plugins_play to load vars for managed_node1 27844 1726882744.41207: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.41210: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.41390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 
1726882744.41603: done with get_vars() 27844 1726882744.41611: variable 'ansible_search_path' from source: unknown 27844 1726882744.41612: variable 'ansible_search_path' from source: unknown 27844 1726882744.41650: we have included files to process 27844 1726882744.41651: generating all_blocks data 27844 1726882744.41653: done generating all_blocks data 27844 1726882744.41654: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882744.41655: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882744.41657: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882744.42109: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000135 27844 1726882744.42112: WORKER PROCESS EXITING 27844 1726882744.42481: done processing included file 27844 1726882744.42483: iterating over new_blocks loaded from include file 27844 1726882744.42485: in VariableManager get_vars() 27844 1726882744.42502: done with get_vars() 27844 1726882744.42504: filtering new block on tags 27844 1726882744.42520: done filtering new block on tags 27844 1726882744.42522: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 27844 1726882744.42527: extending task lists for all hosts with included blocks 27844 1726882744.42859: done extending task lists 27844 1726882744.42861: done processing included files 27844 1726882744.42862: results queue empty 27844 1726882744.42862: checking for any_errors_fatal 27844 1726882744.42867: done checking for any_errors_fatal 27844 1726882744.42868: checking for max_fail_percentage 27844 
1726882744.42869: done checking for max_fail_percentage 27844 1726882744.42870: checking to see if all hosts have failed and the running result is not ok 27844 1726882744.42870: done checking to see if all hosts have failed 27844 1726882744.42871: getting the remaining hosts for this loop 27844 1726882744.42872: done getting the remaining hosts for this loop 27844 1726882744.42875: getting the next task for host managed_node1 27844 1726882744.42879: done getting next task for host managed_node1 27844 1726882744.42881: ^ task is: TASK: Gather current interface info 27844 1726882744.42884: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882744.42886: getting variables 27844 1726882744.42887: in VariableManager get_vars() 27844 1726882744.42899: Calling all_inventory to load vars for managed_node1 27844 1726882744.42901: Calling groups_inventory to load vars for managed_node1 27844 1726882744.42903: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882744.42908: Calling all_plugins_play to load vars for managed_node1 27844 1726882744.42910: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882744.42913: Calling groups_plugins_play to load vars for managed_node1 27844 1726882744.43143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882744.43336: done with get_vars() 27844 1726882744.43345: done getting variables 27844 1726882744.43384: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:04 -0400 (0:00:00.052) 0:00:03.510 ****** 27844 1726882744.43410: entering _queue_task() for managed_node1/command 27844 1726882744.44093: worker is 1 (out of 1 available) 27844 1726882744.44106: exiting _queue_task() for managed_node1/command 27844 1726882744.44120: done queuing things up, now waiting for results queue to drain 27844 1726882744.44126: waiting for pending results... 
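[Annotation] The queued task loads the `command` action plugin, so get_current_interfaces.yml:3 runs a shell command on the managed node; the exact command line never appears in this excerpt, so the `ls /sys/class/net` below and the register name are assumptions for illustration only:

```yaml
# Hedged sketch of TASK [Gather current interface info]; the command
# and register name (_current_interfaces) are hypothetical.
- name: Gather current interface info
  command: ls /sys/class/net
  register: _current_interfaces
```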
27844 1726882744.45009: running TaskExecutor() for managed_node1/TASK: Gather current interface info 27844 1726882744.45549: in run() - task 0e448fcc-3ce9-efa9-466a-00000000014e 27844 1726882744.45573: variable 'ansible_search_path' from source: unknown 27844 1726882744.45582: variable 'ansible_search_path' from source: unknown 27844 1726882744.45624: calling self._execute() 27844 1726882744.45710: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.45722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882744.45735: variable 'omit' from source: magic vars 27844 1726882744.46302: variable 'ansible_distribution_major_version' from source: facts 27844 1726882744.46487: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882744.46497: variable 'omit' from source: magic vars 27844 1726882744.46539: variable 'omit' from source: magic vars 27844 1726882744.46781: variable 'omit' from source: magic vars 27844 1726882744.46822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882744.46858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882744.46886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882744.46907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882744.46922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882744.46951: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882744.46959: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.46971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 
1726882744.47079: Set connection var ansible_shell_type to sh 27844 1726882744.47176: Set connection var ansible_connection to ssh 27844 1726882744.47879: Set connection var ansible_pipelining to False 27844 1726882744.47891: Set connection var ansible_timeout to 10 27844 1726882744.47902: Set connection var ansible_shell_executable to /bin/sh 27844 1726882744.47912: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882744.47944: variable 'ansible_shell_executable' from source: unknown 27844 1726882744.47947: variable 'ansible_connection' from source: unknown 27844 1726882744.47950: variable 'ansible_module_compression' from source: unknown 27844 1726882744.47953: variable 'ansible_shell_type' from source: unknown 27844 1726882744.47955: variable 'ansible_shell_executable' from source: unknown 27844 1726882744.47957: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882744.47959: variable 'ansible_pipelining' from source: unknown 27844 1726882744.47963: variable 'ansible_timeout' from source: unknown 27844 1726882744.47977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882744.48115: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882744.48132: variable 'omit' from source: magic vars 27844 1726882744.48142: starting attempt loop 27844 1726882744.48149: running the handler 27844 1726882744.48173: _low_level_execute_command(): starting 27844 1726882744.48186: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882744.50548: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 27844 1726882744.50553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.50568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.50582: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882744.50597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.50622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882744.50637: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882744.50649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882744.50661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.50681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.50699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.50712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.50732: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882744.50752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.50828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882744.50861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882744.50882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882744.51013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882744.52686: stdout chunk (state=3): >>>/root <<< 27844 1726882744.52880: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882744.52883: stdout chunk (state=3): >>><<< 27844 1726882744.52886: stderr chunk (state=3): >>><<< 27844 1726882744.53016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882744.53020: _low_level_execute_command(): starting 27844 1726882744.53023: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202 `" && echo ansible-tmp-1726882744.5291028-28072-129111486680202="` echo /root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202 `" ) && sleep 0' 27844 1726882744.54338: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 27844 1726882744.54349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.54376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.54380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.54382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.54443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882744.55187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882744.55190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882744.55299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882744.57177: stdout chunk (state=3): >>>ansible-tmp-1726882744.5291028-28072-129111486680202=/root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202 <<< 27844 1726882744.57351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882744.57354: stdout chunk (state=3): >>><<< 27844 1726882744.57363: stderr chunk (state=3): >>><<< 27844 1726882744.57471: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882744.5291028-28072-129111486680202=/root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882744.57476: variable 'ansible_module_compression' from source: unknown 27844 1726882744.57671: ANSIBALLZ: Using generic lock for ansible.legacy.command 27844 1726882744.57675: ANSIBALLZ: Acquiring lock 27844 1726882744.57681: ANSIBALLZ: Lock acquired: 139916607833536 27844 1726882744.57684: ANSIBALLZ: Creating module 27844 1726882744.81449: ANSIBALLZ: Writing module into payload 27844 1726882744.81573: ANSIBALLZ: Writing module 27844 1726882744.81896: ANSIBALLZ: Renaming module 27844 1726882744.81906: ANSIBALLZ: Done creating module 27844 1726882744.81927: variable 'ansible_facts' from source: unknown 27844 1726882744.82019: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202/AnsiballZ_command.py 27844 1726882744.82625: Sending initial data 27844 1726882744.82628: Sent initial data (156 bytes) 27844 1726882744.84392: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882744.85084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.85099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.85116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.85158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.85176: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882744.85191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.85208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882744.85219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882744.85229: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882744.85241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.85256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.85282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.85295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.85307: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882744.85321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 
1726882744.85400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882744.85422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882744.85437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882744.85587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882744.87425: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 27844 1726882744.87429: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882744.87512: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882744.87608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp7xigtilq /root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202/AnsiballZ_command.py <<< 27844 1726882744.87705: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882744.89080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882744.89246: stderr chunk (state=3): >>><<< 27844 1726882744.89250: stdout chunk (state=3): >>><<< 27844 1726882744.89253: done transferring module to remote 27844 1726882744.89256: _low_level_execute_command(): starting 27844 1726882744.89259: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202/ /root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202/AnsiballZ_command.py && sleep 0' 27844 1726882744.90216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882744.90768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.90788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.90808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.90849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.90981: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882744.90997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.91015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882744.91028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882744.91039: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882744.91051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.91070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.91089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.91102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.91114: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882744.91128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.91205: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882744.91223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882744.91237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882744.91433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882744.93271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882744.93275: stdout chunk (state=3): >>><<< 27844 1726882744.93278: stderr chunk (state=3): >>><<< 27844 1726882744.93280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882744.93282: _low_level_execute_command(): starting 27844 1726882744.93285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202/AnsiballZ_command.py && sleep 0' 27844 1726882744.93872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882744.93889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.93905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.93922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.93967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.93982: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882744.93995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.94011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882744.94021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882744.94030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882744.94041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882744.94052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882744.94074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882744.94088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882744.94101: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882744.94115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882744.94193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 
1726882744.94209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882744.94421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882744.94552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.07944: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:05.074582", "end": "2024-09-20 21:39:05.077757", "delta": "0:00:00.003175", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882745.09168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882745.09172: stdout chunk (state=3): >>><<< 27844 1726882745.09174: stderr chunk (state=3): >>><<< 27844 1726882745.09272: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:05.074582", "end": "2024-09-20 21:39:05.077757", "delta": "0:00:00.003175", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882745.09281: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882745.09284: _low_level_execute_command(): starting 27844 1726882745.09286: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882744.5291028-28072-129111486680202/ > /dev/null 2>&1 && sleep 0' 27844 1726882745.10516: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 
1726882745.10530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.10543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.10560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.10607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.10619: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882745.10632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.10648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882745.10659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882745.10675: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882745.10687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.10699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.10717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.10729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.10741: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882745.10755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.10832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.10856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882745.10877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.11004: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.12877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.12880: stdout chunk (state=3): >>><<< 27844 1726882745.12882: stderr chunk (state=3): >>><<< 27844 1726882745.13280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882745.13283: handler run complete 27844 1726882745.13286: Evaluated conditional (False): False 27844 1726882745.13288: attempt loop complete, returning result 27844 1726882745.13290: _execute() done 27844 1726882745.13292: dumping result to json 27844 1726882745.13294: done dumping result, returning 27844 1726882745.13296: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-efa9-466a-00000000014e] 27844 1726882745.13298: sending 
task result for task 0e448fcc-3ce9-efa9-466a-00000000014e 27844 1726882745.13378: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000014e 27844 1726882745.13383: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003175", "end": "2024-09-20 21:39:05.077757", "rc": 0, "start": "2024-09-20 21:39:05.074582" } STDOUT: bonding_masters eth0 lo 27844 1726882745.13455: no more pending results, returning what we have 27844 1726882745.13459: results queue empty 27844 1726882745.13459: checking for any_errors_fatal 27844 1726882745.13461: done checking for any_errors_fatal 27844 1726882745.13462: checking for max_fail_percentage 27844 1726882745.13469: done checking for max_fail_percentage 27844 1726882745.13470: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.13471: done checking to see if all hosts have failed 27844 1726882745.13472: getting the remaining hosts for this loop 27844 1726882745.13473: done getting the remaining hosts for this loop 27844 1726882745.13477: getting the next task for host managed_node1 27844 1726882745.13483: done getting next task for host managed_node1 27844 1726882745.13486: ^ task is: TASK: Set current_interfaces 27844 1726882745.13489: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882745.13493: getting variables 27844 1726882745.13494: in VariableManager get_vars() 27844 1726882745.13533: Calling all_inventory to load vars for managed_node1 27844 1726882745.13536: Calling groups_inventory to load vars for managed_node1 27844 1726882745.13538: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.13548: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.13551: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.13555: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.13776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.14060: done with get_vars() 27844 1726882745.14078: done getting variables 27844 1726882745.14140: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:05 -0400 (0:00:00.709) 0:00:04.220 ****** 27844 1726882745.14375: entering _queue_task() for managed_node1/set_fact 27844 1726882745.14830: worker is 1 (out of 1 available) 27844 1726882745.14842: exiting _queue_task() for managed_node1/set_fact 27844 1726882745.14853: done queuing things up, now waiting for results queue to drain 27844 1726882745.14855: waiting for pending results... 
27844 1726882745.15753: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 27844 1726882745.15831: in run() - task 0e448fcc-3ce9-efa9-466a-00000000014f 27844 1726882745.15956: variable 'ansible_search_path' from source: unknown 27844 1726882745.15968: variable 'ansible_search_path' from source: unknown 27844 1726882745.16013: calling self._execute() 27844 1726882745.16100: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.16180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.16194: variable 'omit' from source: magic vars 27844 1726882745.16907: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.17054: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.17069: variable 'omit' from source: magic vars 27844 1726882745.17115: variable 'omit' from source: magic vars 27844 1726882745.17225: variable '_current_interfaces' from source: set_fact 27844 1726882745.17425: variable 'omit' from source: magic vars 27844 1726882745.17509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882745.17616: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882745.17641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882745.17665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.17808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.17843: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882745.17852: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.17861: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.17979: Set connection var ansible_shell_type to sh 27844 1726882745.18107: Set connection var ansible_connection to ssh 27844 1726882745.18123: Set connection var ansible_pipelining to False 27844 1726882745.18132: Set connection var ansible_timeout to 10 27844 1726882745.18138: Set connection var ansible_shell_executable to /bin/sh 27844 1726882745.18144: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882745.18172: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.18176: variable 'ansible_connection' from source: unknown 27844 1726882745.18180: variable 'ansible_module_compression' from source: unknown 27844 1726882745.18182: variable 'ansible_shell_type' from source: unknown 27844 1726882745.18185: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.18187: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.18189: variable 'ansible_pipelining' from source: unknown 27844 1726882745.18191: variable 'ansible_timeout' from source: unknown 27844 1726882745.18193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.18440: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882745.18590: variable 'omit' from source: magic vars 27844 1726882745.18595: starting attempt loop 27844 1726882745.18598: running the handler 27844 1726882745.18610: handler run complete 27844 1726882745.18621: attempt loop complete, returning result 27844 1726882745.18623: _execute() done 27844 1726882745.18626: dumping result to json 27844 1726882745.18630: done dumping result, returning 27844 
1726882745.18635: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-efa9-466a-00000000014f] 27844 1726882745.18638: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000014f ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 27844 1726882745.18789: no more pending results, returning what we have 27844 1726882745.18793: results queue empty 27844 1726882745.18793: checking for any_errors_fatal 27844 1726882745.18800: done checking for any_errors_fatal 27844 1726882745.18801: checking for max_fail_percentage 27844 1726882745.18802: done checking for max_fail_percentage 27844 1726882745.18802: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.18804: done checking to see if all hosts have failed 27844 1726882745.18804: getting the remaining hosts for this loop 27844 1726882745.18806: done getting the remaining hosts for this loop 27844 1726882745.18809: getting the next task for host managed_node1 27844 1726882745.18816: done getting next task for host managed_node1 27844 1726882745.18819: ^ task is: TASK: Show current_interfaces 27844 1726882745.18821: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.18825: getting variables 27844 1726882745.18827: in VariableManager get_vars() 27844 1726882745.18869: Calling all_inventory to load vars for managed_node1 27844 1726882745.18871: Calling groups_inventory to load vars for managed_node1 27844 1726882745.18874: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.18885: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.18889: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.18892: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.19060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.19260: done with get_vars() 27844 1726882745.19275: done getting variables 27844 1726882745.19309: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000014f 27844 1726882745.19312: WORKER PROCESS EXITING 27844 1726882745.19384: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:05 -0400 (0:00:00.051) 0:00:04.271 ****** 27844 1726882745.19527: entering _queue_task() for managed_node1/debug 27844 1726882745.19529: Creating lock for debug 27844 1726882745.20015: worker is 1 (out of 1 available) 27844 1726882745.20028: exiting _queue_task() for managed_node1/debug 27844 1726882745.20040: done queuing things up, now waiting for results queue to drain 27844 1726882745.20042: waiting for pending results... 
27844 1726882745.20803: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 27844 1726882745.20911: in run() - task 0e448fcc-3ce9-efa9-466a-000000000136 27844 1726882745.20937: variable 'ansible_search_path' from source: unknown 27844 1726882745.21037: variable 'ansible_search_path' from source: unknown 27844 1726882745.21083: calling self._execute() 27844 1726882745.21280: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.21291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.21304: variable 'omit' from source: magic vars 27844 1726882745.22041: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.22131: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.22142: variable 'omit' from source: magic vars 27844 1726882745.22185: variable 'omit' from source: magic vars 27844 1726882745.22315: variable 'current_interfaces' from source: set_fact 27844 1726882745.22473: variable 'omit' from source: magic vars 27844 1726882745.22517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882745.22587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882745.22682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882745.22705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.22720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.22799: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882745.22889: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.22899: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.23115: Set connection var ansible_shell_type to sh 27844 1726882745.23123: Set connection var ansible_connection to ssh 27844 1726882745.23133: Set connection var ansible_pipelining to False 27844 1726882745.23143: Set connection var ansible_timeout to 10 27844 1726882745.23152: Set connection var ansible_shell_executable to /bin/sh 27844 1726882745.23162: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882745.23194: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.23208: variable 'ansible_connection' from source: unknown 27844 1726882745.23321: variable 'ansible_module_compression' from source: unknown 27844 1726882745.23330: variable 'ansible_shell_type' from source: unknown 27844 1726882745.23338: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.23345: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.23353: variable 'ansible_pipelining' from source: unknown 27844 1726882745.23359: variable 'ansible_timeout' from source: unknown 27844 1726882745.23370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.23514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882745.23653: variable 'omit' from source: magic vars 27844 1726882745.23662: starting attempt loop 27844 1726882745.23673: running the handler 27844 1726882745.23725: handler run complete 27844 1726882745.23876: attempt loop complete, returning result 27844 1726882745.23884: _execute() done 27844 1726882745.23890: dumping result to json 27844 1726882745.23898: done dumping result, returning 27844 1726882745.23909: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-efa9-466a-000000000136] 27844 1726882745.23918: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000136 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 27844 1726882745.24072: no more pending results, returning what we have 27844 1726882745.24077: results queue empty 27844 1726882745.24078: checking for any_errors_fatal 27844 1726882745.24083: done checking for any_errors_fatal 27844 1726882745.24083: checking for max_fail_percentage 27844 1726882745.24085: done checking for max_fail_percentage 27844 1726882745.24085: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.24087: done checking to see if all hosts have failed 27844 1726882745.24087: getting the remaining hosts for this loop 27844 1726882745.24089: done getting the remaining hosts for this loop 27844 1726882745.24093: getting the next task for host managed_node1 27844 1726882745.24099: done getting next task for host managed_node1 27844 1726882745.24103: ^ task is: TASK: Manage test interface 27844 1726882745.24105: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.24108: getting variables 27844 1726882745.24109: in VariableManager get_vars() 27844 1726882745.24152: Calling all_inventory to load vars for managed_node1 27844 1726882745.24155: Calling groups_inventory to load vars for managed_node1 27844 1726882745.24158: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.24175: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.24178: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.24182: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.24381: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000136 27844 1726882745.24384: WORKER PROCESS EXITING 27844 1726882745.24406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.24604: done with get_vars() 27844 1726882745.24613: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:17 Friday 20 September 2024 21:39:05 -0400 (0:00:00.052) 0:00:04.324 ****** 27844 1726882745.24778: entering _queue_task() for managed_node1/include_tasks 27844 1726882745.25253: worker is 1 (out of 1 available) 27844 1726882745.25270: exiting _queue_task() for managed_node1/include_tasks 27844 1726882745.25282: done queuing things up, now waiting for results queue to drain 27844 1726882745.25283: waiting for pending results... 
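The `Show current_interfaces` task above printed `current_interfaces: ['bonding_masters', 'eth0', 'lo']` through the `debug` action plugin. Judging from the task path (`tasks/show_interfaces.yml:5`) and the rendered message, the task is presumably a one-line `debug` along these lines; the exact message template is an assumption, not something the log shows:

```yaml
# Hypothetical reconstruction of tasks/show_interfaces.yml:5.
# Only the task name and the rendered output are confirmed by the log;
# the msg template is assumed.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```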
27844 1726882745.25530: running TaskExecutor() for managed_node1/TASK: Manage test interface 27844 1726882745.25621: in run() - task 0e448fcc-3ce9-efa9-466a-00000000000d 27844 1726882745.25639: variable 'ansible_search_path' from source: unknown 27844 1726882745.25687: calling self._execute() 27844 1726882745.25814: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.25822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.25836: variable 'omit' from source: magic vars 27844 1726882745.26232: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.26250: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.26261: _execute() done 27844 1726882745.26277: dumping result to json 27844 1726882745.26285: done dumping result, returning 27844 1726882745.26295: done running TaskExecutor() for managed_node1/TASK: Manage test interface [0e448fcc-3ce9-efa9-466a-00000000000d] 27844 1726882745.26304: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000d 27844 1726882745.26431: no more pending results, returning what we have 27844 1726882745.26437: in VariableManager get_vars() 27844 1726882745.26491: Calling all_inventory to load vars for managed_node1 27844 1726882745.26494: Calling groups_inventory to load vars for managed_node1 27844 1726882745.26497: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.26513: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.26516: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.26519: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.26711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.27000: done with get_vars() 27844 1726882745.27007: variable 'ansible_search_path' from source: unknown 27844 1726882745.27021: 
we have included files to process 27844 1726882745.27025: generating all_blocks data 27844 1726882745.27027: done generating all_blocks data 27844 1726882745.27034: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27844 1726882745.27035: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27844 1726882745.27038: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27844 1726882745.27723: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000d 27844 1726882745.27727: WORKER PROCESS EXITING 27844 1726882745.28008: in VariableManager get_vars() 27844 1726882745.28030: done with get_vars() 27844 1726882745.28252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 27844 1726882745.28822: done processing included file 27844 1726882745.28824: iterating over new_blocks loaded from include file 27844 1726882745.28825: in VariableManager get_vars() 27844 1726882745.28845: done with get_vars() 27844 1726882745.28847: filtering new block on tags 27844 1726882745.28879: done filtering new block on tags 27844 1726882745.28892: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 27844 1726882745.28897: extending task lists for all hosts with included blocks 27844 1726882745.29218: done extending task lists 27844 1726882745.29219: done processing included files 27844 1726882745.29220: results queue empty 27844 1726882745.29220: checking for any_errors_fatal 27844 1726882745.29223: done checking for any_errors_fatal 27844 1726882745.29224: checking for max_fail_percentage 27844 1726882745.29225: done checking for 
max_fail_percentage 27844 1726882745.29226: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.29227: done checking to see if all hosts have failed 27844 1726882745.29227: getting the remaining hosts for this loop 27844 1726882745.29228: done getting the remaining hosts for this loop 27844 1726882745.29231: getting the next task for host managed_node1 27844 1726882745.29391: done getting next task for host managed_node1 27844 1726882745.29394: ^ task is: TASK: Ensure state in ["present", "absent"] 27844 1726882745.29397: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.29399: getting variables 27844 1726882745.29400: in VariableManager get_vars() 27844 1726882745.29413: Calling all_inventory to load vars for managed_node1 27844 1726882745.29415: Calling groups_inventory to load vars for managed_node1 27844 1726882745.29417: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.29423: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.29425: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.29428: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.29577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.29793: done with get_vars() 27844 1726882745.29802: done getting variables 27844 1726882745.29872: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:39:05 -0400 (0:00:00.051) 0:00:04.375 ****** 27844 1726882745.29900: entering _queue_task() for managed_node1/fail 27844 1726882745.29902: Creating lock for fail 27844 1726882745.30176: worker is 1 (out of 1 available) 27844 1726882745.30189: exiting _queue_task() for managed_node1/fail 27844 1726882745.30202: done queuing things up, now waiting for results queue to drain 27844 1726882745.30203: waiting for pending results... 
27844 1726882745.31216: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 27844 1726882745.31326: in run() - task 0e448fcc-3ce9-efa9-466a-00000000016a 27844 1726882745.31353: variable 'ansible_search_path' from source: unknown 27844 1726882745.31363: variable 'ansible_search_path' from source: unknown 27844 1726882745.31410: calling self._execute() 27844 1726882745.31506: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.31517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.31530: variable 'omit' from source: magic vars 27844 1726882745.31919: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.31939: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.32090: variable 'state' from source: include params 27844 1726882745.32106: Evaluated conditional (state not in ["present", "absent"]): False 27844 1726882745.32112: when evaluation is False, skipping this task 27844 1726882745.32117: _execute() done 27844 1726882745.32121: dumping result to json 27844 1726882745.32127: done dumping result, returning 27844 1726882745.32139: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-efa9-466a-00000000016a] 27844 1726882745.32148: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016a skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 27844 1726882745.32288: no more pending results, returning what we have 27844 1726882745.32292: results queue empty 27844 1726882745.32293: checking for any_errors_fatal 27844 1726882745.32295: done checking for any_errors_fatal 27844 1726882745.32295: checking for max_fail_percentage 27844 1726882745.32297: done checking for max_fail_percentage 27844 1726882745.32298: checking to see if all hosts 
have failed and the running result is not ok 27844 1726882745.32299: done checking to see if all hosts have failed 27844 1726882745.32299: getting the remaining hosts for this loop 27844 1726882745.32301: done getting the remaining hosts for this loop 27844 1726882745.32304: getting the next task for host managed_node1 27844 1726882745.32310: done getting next task for host managed_node1 27844 1726882745.32313: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 27844 1726882745.32316: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.32319: getting variables 27844 1726882745.32321: in VariableManager get_vars() 27844 1726882745.32361: Calling all_inventory to load vars for managed_node1 27844 1726882745.32367: Calling groups_inventory to load vars for managed_node1 27844 1726882745.32370: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.32382: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.32385: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.32387: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.32599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.32817: done with get_vars() 27844 1726882745.32826: done getting variables 27844 1726882745.33512: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016a 27844 1726882745.33515: WORKER PROCESS EXITING 27844 1726882745.33536: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:39:05 -0400 (0:00:00.036) 0:00:04.412 ****** 27844 1726882745.33577: entering _queue_task() for managed_node1/fail 27844 1726882745.33954: worker is 1 (out of 1 available) 27844 1726882745.33969: exiting _queue_task() for managed_node1/fail 27844 1726882745.33980: done queuing things up, now waiting for results queue to drain 27844 1726882745.33982: waiting for pending results... 
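The `Ensure state in ["present", "absent"]` task above was skipped rather than run: the log shows the `when` expression evaluated to `False`, and the worker returned a result carrying `false_condition` and `skip_reason` instead of invoking the `fail` handler. A minimal Python sketch of that skip decision (hypothetical, not Ansible's actual `TaskExecutor` code) looks like this:

```python
def run_or_skip(when_result: bool, false_condition: str) -> dict:
    """Sketch of the skip behavior visible in the log: a False `when`
    result yields a 'skipping' result instead of running the handler."""
    if not when_result:
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }
    # Otherwise the action plugin (here, `fail`) would actually run.
    return {"changed": False, "skipped": False}

# The log's case: state == "present", so the guard condition is False
# and the fail task is skipped.
state = "present"
result = run_or_skip(
    when_result=(state not in ["present", "absent"]),
    false_condition='state not in ["present", "absent"]',
)
```

This mirrors the JSON the log prints for the skipped task: `"changed": false`, the original condition string, and the fixed skip reason.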
27844 1726882745.34231: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 27844 1726882745.34337: in run() - task 0e448fcc-3ce9-efa9-466a-00000000016b 27844 1726882745.34362: variable 'ansible_search_path' from source: unknown 27844 1726882745.34374: variable 'ansible_search_path' from source: unknown 27844 1726882745.34412: calling self._execute() 27844 1726882745.34500: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.34510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.34522: variable 'omit' from source: magic vars 27844 1726882745.34931: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.34950: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.35106: variable 'type' from source: set_fact 27844 1726882745.35123: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 27844 1726882745.35131: when evaluation is False, skipping this task 27844 1726882745.35138: _execute() done 27844 1726882745.35145: dumping result to json 27844 1726882745.35153: done dumping result, returning 27844 1726882745.35162: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-efa9-466a-00000000016b] 27844 1726882745.35176: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016b 27844 1726882745.35268: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016b 27844 1726882745.35275: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 27844 1726882745.35330: no more pending results, returning what we have 27844 1726882745.35335: results queue empty 27844 1726882745.35336: checking for any_errors_fatal 27844 1726882745.35341: done checking for any_errors_fatal 27844 1726882745.35342: 
checking for max_fail_percentage 27844 1726882745.35343: done checking for max_fail_percentage 27844 1726882745.35344: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.35345: done checking to see if all hosts have failed 27844 1726882745.35346: getting the remaining hosts for this loop 27844 1726882745.35347: done getting the remaining hosts for this loop 27844 1726882745.35350: getting the next task for host managed_node1 27844 1726882745.35356: done getting next task for host managed_node1 27844 1726882745.35358: ^ task is: TASK: Include the task 'show_interfaces.yml' 27844 1726882745.35361: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.35368: getting variables 27844 1726882745.35369: in VariableManager get_vars() 27844 1726882745.35411: Calling all_inventory to load vars for managed_node1 27844 1726882745.35414: Calling groups_inventory to load vars for managed_node1 27844 1726882745.35417: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.35430: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.35433: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.35436: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.35609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.35825: done with get_vars() 27844 1726882745.35835: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:39:05 -0400 (0:00:00.023) 0:00:04.436 ****** 27844 1726882745.35960: entering _queue_task() for managed_node1/include_tasks 27844 1726882745.36393: worker is 1 (out of 1 available) 27844 1726882745.36404: exiting _queue_task() for managed_node1/include_tasks 27844 1726882745.36416: done queuing things up, now waiting for results queue to drain 27844 1726882745.36417: waiting for pending results... 
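The two skipped guards above (`manage_test_interface.yml:3` and `:8`) validate the include parameters before the test interface is touched. From the task names and the `false_condition` strings in the log, they are presumably shaped like this; the `when` expressions match the log verbatim, while the failure messages are assumed:

```yaml
# Hypothetical reconstruction of the guard tasks in
# tasks/manage_test_interface.yml; msg text is assumed.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"
  when: type not in ["dummy", "tap", "veth"]
```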
27844 1726882745.37303: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 27844 1726882745.37480: in run() - task 0e448fcc-3ce9-efa9-466a-00000000016c 27844 1726882745.37499: variable 'ansible_search_path' from source: unknown 27844 1726882745.37508: variable 'ansible_search_path' from source: unknown 27844 1726882745.37596: calling self._execute() 27844 1726882745.37692: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.37704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.37737: variable 'omit' from source: magic vars 27844 1726882745.38180: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.38203: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.38214: _execute() done 27844 1726882745.38222: dumping result to json 27844 1726882745.38231: done dumping result, returning 27844 1726882745.38241: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-efa9-466a-00000000016c] 27844 1726882745.38251: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016c 27844 1726882745.38380: no more pending results, returning what we have 27844 1726882745.38386: in VariableManager get_vars() 27844 1726882745.38428: Calling all_inventory to load vars for managed_node1 27844 1726882745.38431: Calling groups_inventory to load vars for managed_node1 27844 1726882745.38433: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.38446: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.38449: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.38451: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.38667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.38883: done with 
get_vars() 27844 1726882745.38889: variable 'ansible_search_path' from source: unknown 27844 1726882745.38890: variable 'ansible_search_path' from source: unknown 27844 1726882745.38926: we have included files to process 27844 1726882745.38927: generating all_blocks data 27844 1726882745.38930: done generating all_blocks data 27844 1726882745.38936: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882745.38938: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882745.38940: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882745.39158: in VariableManager get_vars() 27844 1726882745.39182: done with get_vars() 27844 1726882745.39342: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016c 27844 1726882745.39345: WORKER PROCESS EXITING 27844 1726882745.39458: done processing included file 27844 1726882745.39460: iterating over new_blocks loaded from include file 27844 1726882745.39461: in VariableManager get_vars() 27844 1726882745.39484: done with get_vars() 27844 1726882745.39486: filtering new block on tags 27844 1726882745.39512: done filtering new block on tags 27844 1726882745.39515: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 27844 1726882745.39538: extending task lists for all hosts with included blocks 27844 1726882745.41246: done extending task lists 27844 1726882745.41247: done processing included files 27844 1726882745.41248: results queue empty 27844 1726882745.41248: checking for any_errors_fatal 27844 1726882745.41251: done checking for any_errors_fatal 27844 1726882745.41251: checking for 
max_fail_percentage 27844 1726882745.41252: done checking for max_fail_percentage 27844 1726882745.41253: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.41254: done checking to see if all hosts have failed 27844 1726882745.41254: getting the remaining hosts for this loop 27844 1726882745.41255: done getting the remaining hosts for this loop 27844 1726882745.41257: getting the next task for host managed_node1 27844 1726882745.41261: done getting next task for host managed_node1 27844 1726882745.41263: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27844 1726882745.41270: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.41273: getting variables 27844 1726882745.41274: in VariableManager get_vars() 27844 1726882745.41286: Calling all_inventory to load vars for managed_node1 27844 1726882745.41288: Calling groups_inventory to load vars for managed_node1 27844 1726882745.41290: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.41295: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.41297: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.41300: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.41457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.42037: done with get_vars() 27844 1726882745.42046: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:05 -0400 (0:00:00.061) 0:00:04.497 ****** 27844 1726882745.42130: entering _queue_task() for managed_node1/include_tasks 27844 1726882745.42409: worker is 1 (out of 1 available) 27844 1726882745.42425: exiting _queue_task() for managed_node1/include_tasks 27844 1726882745.42437: done queuing things up, now waiting for results queue to drain 27844 1726882745.42438: waiting for pending results... 
27844 1726882745.42708: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 27844 1726882745.42835: in run() - task 0e448fcc-3ce9-efa9-466a-00000000019d 27844 1726882745.42856: variable 'ansible_search_path' from source: unknown 27844 1726882745.42869: variable 'ansible_search_path' from source: unknown 27844 1726882745.42911: calling self._execute() 27844 1726882745.43005: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.43016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.43028: variable 'omit' from source: magic vars 27844 1726882745.43433: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.43453: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.43468: _execute() done 27844 1726882745.43478: dumping result to json 27844 1726882745.43492: done dumping result, returning 27844 1726882745.43504: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-efa9-466a-00000000019d] 27844 1726882745.43517: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000019d 27844 1726882745.43667: no more pending results, returning what we have 27844 1726882745.43673: in VariableManager get_vars() 27844 1726882745.43720: Calling all_inventory to load vars for managed_node1 27844 1726882745.43724: Calling groups_inventory to load vars for managed_node1 27844 1726882745.43726: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.43742: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.43745: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.43748: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.43936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 
1726882745.44261: done with get_vars() 27844 1726882745.44274: variable 'ansible_search_path' from source: unknown 27844 1726882745.44275: variable 'ansible_search_path' from source: unknown 27844 1726882745.44345: we have included files to process 27844 1726882745.44352: generating all_blocks data 27844 1726882745.44354: done generating all_blocks data 27844 1726882745.44357: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882745.44358: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882745.44360: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882745.44639: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000019d 27844 1726882745.44643: WORKER PROCESS EXITING 27844 1726882745.44889: done processing included file 27844 1726882745.44891: iterating over new_blocks loaded from include file 27844 1726882745.44893: in VariableManager get_vars() 27844 1726882745.44913: done with get_vars() 27844 1726882745.44915: filtering new block on tags 27844 1726882745.44932: done filtering new block on tags 27844 1726882745.44935: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 27844 1726882745.44939: extending task lists for all hosts with included blocks 27844 1726882745.45108: done extending task lists 27844 1726882745.45109: done processing included files 27844 1726882745.45110: results queue empty 27844 1726882745.45111: checking for any_errors_fatal 27844 1726882745.45113: done checking for any_errors_fatal 27844 1726882745.45114: checking for max_fail_percentage 27844 
1726882745.45115: done checking for max_fail_percentage 27844 1726882745.45116: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.45117: done checking to see if all hosts have failed 27844 1726882745.45117: getting the remaining hosts for this loop 27844 1726882745.45119: done getting the remaining hosts for this loop 27844 1726882745.45121: getting the next task for host managed_node1 27844 1726882745.45126: done getting next task for host managed_node1 27844 1726882745.45128: ^ task is: TASK: Gather current interface info 27844 1726882745.45131: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.45133: getting variables 27844 1726882745.45134: in VariableManager get_vars() 27844 1726882745.45147: Calling all_inventory to load vars for managed_node1 27844 1726882745.45149: Calling groups_inventory to load vars for managed_node1 27844 1726882745.45151: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.45155: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.45158: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.45160: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.45332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.45543: done with get_vars() 27844 1726882745.45550: done getting variables 27844 1726882745.45590: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:05 -0400 (0:00:00.034) 0:00:04.532 ****** 27844 1726882745.45622: entering _queue_task() for managed_node1/command 27844 1726882745.45848: worker is 1 (out of 1 available) 27844 1726882745.45859: exiting _queue_task() for managed_node1/command 27844 1726882745.45874: done queuing things up, now waiting for results queue to drain 27844 1726882745.45876: waiting for pending results... 
27844 1726882745.46187: running TaskExecutor() for managed_node1/TASK: Gather current interface info 27844 1726882745.46307: in run() - task 0e448fcc-3ce9-efa9-466a-0000000001d4 27844 1726882745.46331: variable 'ansible_search_path' from source: unknown 27844 1726882745.46340: variable 'ansible_search_path' from source: unknown 27844 1726882745.46385: calling self._execute() 27844 1726882745.46473: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.46489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.46503: variable 'omit' from source: magic vars 27844 1726882745.46889: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.46908: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.46924: variable 'omit' from source: magic vars 27844 1726882745.46985: variable 'omit' from source: magic vars 27844 1726882745.47029: variable 'omit' from source: magic vars 27844 1726882745.47074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882745.47116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882745.47146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882745.47172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.47194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.47225: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882745.47234: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.47248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 
1726882745.47357: Set connection var ansible_shell_type to sh 27844 1726882745.47370: Set connection var ansible_connection to ssh 27844 1726882745.47380: Set connection var ansible_pipelining to False 27844 1726882745.47389: Set connection var ansible_timeout to 10 27844 1726882745.47398: Set connection var ansible_shell_executable to /bin/sh 27844 1726882745.47410: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882745.47438: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.47446: variable 'ansible_connection' from source: unknown 27844 1726882745.47453: variable 'ansible_module_compression' from source: unknown 27844 1726882745.47469: variable 'ansible_shell_type' from source: unknown 27844 1726882745.47479: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.47486: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.47494: variable 'ansible_pipelining' from source: unknown 27844 1726882745.47501: variable 'ansible_timeout' from source: unknown 27844 1726882745.47511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.47651: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882745.47673: variable 'omit' from source: magic vars 27844 1726882745.47689: starting attempt loop 27844 1726882745.47697: running the handler 27844 1726882745.47723: _low_level_execute_command(): starting 27844 1726882745.47741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882745.48607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882745.48624: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 27844 1726882745.48640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.48660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.48711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.48728: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882745.48743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.48768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882745.48785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882745.48797: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882745.48810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.48823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.48842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.48854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.48868: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882745.48883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.48971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.48993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882745.49020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.49190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 27844 1726882745.50822: stdout chunk (state=3): >>>/root <<< 27844 1726882745.51011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.51014: stdout chunk (state=3): >>><<< 27844 1726882745.51016: stderr chunk (state=3): >>><<< 27844 1726882745.51130: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882745.51135: _low_level_execute_command(): starting 27844 1726882745.51139: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586 `" && echo ansible-tmp-1726882745.5103493-28118-227736315813586="` echo /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586 `" ) && sleep 0' 27844 1726882745.51726: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882745.51739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.51752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.51772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.51822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.51833: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882745.51845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.51860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882745.51877: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882745.51891: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882745.51901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.51914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.51927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.51935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.51943: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882745.51953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.52041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.52058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882745.52078: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 27844 1726882745.52204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.54067: stdout chunk (state=3): >>>ansible-tmp-1726882745.5103493-28118-227736315813586=/root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586 <<< 27844 1726882745.54234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.54238: stdout chunk (state=3): >>><<< 27844 1726882745.54245: stderr chunk (state=3): >>><<< 27844 1726882745.54263: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882745.5103493-28118-227736315813586=/root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882745.54300: variable 'ansible_module_compression' from source: unknown 27844 1726882745.54347: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882745.54385: variable 'ansible_facts' from source: unknown 27844 1726882745.54463: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586/AnsiballZ_command.py 27844 1726882745.54601: Sending initial data 27844 1726882745.54605: Sent initial data (156 bytes) 27844 1726882745.55525: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882745.55534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.55545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.55560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.55598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.55605: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882745.55615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.55627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882745.55634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882745.55640: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882745.55647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.55655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.55668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.55678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 27844 1726882745.55682: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882745.55691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.55760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.55779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882745.55788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.55910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.57622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882745.57707: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882745.57804: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpg7k0dloe /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586/AnsiballZ_command.py <<< 27844 1726882745.57898: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882745.59281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.59471: stderr chunk (state=3): >>><<< 27844 1726882745.59476: stdout chunk (state=3): >>><<< 27844 
1726882745.59478: done transferring module to remote 27844 1726882745.59485: _low_level_execute_command(): starting 27844 1726882745.59487: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586/ /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586/AnsiballZ_command.py && sleep 0' 27844 1726882745.60125: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882745.60137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.60148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.60162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.60206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.60216: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882745.60227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.60241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882745.60250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882745.60259: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882745.60275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.60287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.60303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.60313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.60321: stderr chunk 
(state=3): >>>debug2: match found <<< 27844 1726882745.60331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.60413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.60429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882745.60441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.60562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.62332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.62337: stdout chunk (state=3): >>><<< 27844 1726882745.62345: stderr chunk (state=3): >>><<< 27844 1726882745.62359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
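The three `_low_level_execute_command()` calls above form Ansible's standard remote execution handshake: create a per-task temp directory under `~/.ansible/tmp`, sftp the AnsiballZ module payload into it, then `chmod u+x` both the directory and the payload before running it. A minimal sketch of that remote-side sequence, assuming a writable `$HOME` (a fixed demo name is used instead of the real timestamp-plus-random-suffix directory names such as `ansible-tmp-1726882745.5103493-28118-...`):

```shell
# Sketch of the remote-side sequence logged above: temp dir, payload, chmod.
# Names are illustrative; Ansible embeds a timestamp and random suffix.
umask 77
tmpdir="$HOME/.ansible/tmp/ansible-tmp-demo"
mkdir -p "$tmpdir"                      # Ansible uses plain mkdir so a name clash fails loudly
touch "$tmpdir/AnsiballZ_command.py"    # stand-in for the sftp'd module payload
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_command.py"
echo "$tmpdir"
```

The `umask 77` matches the `( umask 77 && mkdir -p ... )` wrapper in the log, ensuring the temp directory is readable only by the connecting user.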
27844 1726882745.62362: _low_level_execute_command(): starting 27844 1726882745.62370: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586/AnsiballZ_command.py && sleep 0' 27844 1726882745.63007: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882745.63016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.63027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.63039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.63080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.63089: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882745.63105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.63119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882745.63126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882745.63133: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882745.63140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.63154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.63159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.63169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882745.63175: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882745.63184: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.63260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.63295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882745.63298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.63427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.76871: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:05.763791", "end": "2024-09-20 21:39:05.767131", "delta": "0:00:00.003340", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882745.78186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882745.78207: stderr chunk (state=3): >>><<< 27844 1726882745.78211: stdout chunk (state=3): >>><<< 27844 1726882745.78240: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:05.763791", "end": "2024-09-20 21:39:05.767131", "delta": "0:00:00.003340", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
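The `module_args` in the result above (`chdir: /sys/class/net`, `_raw_params: ls -1`) show what the command module actually ran: `ls -1` inside `/sys/class/net`, yielding one interface name per line (`bonding_masters`, `eth0`, `lo`). A self-contained sketch of that listing, using a stand-in directory so it does not depend on the target host's real interfaces:

```shell
# Emulate the module's effective command: `ls -1` with chdir into a directory
# whose entries are interface names (stand-in dir; real run used /sys/class/net).
demo="${TMPDIR:-/tmp}/ansible-netdemo"
mkdir -p "$demo"
touch "$demo/bonding_masters" "$demo/eth0" "$demo/lo"
( cd "$demo" && ls -1 )   # one name per line, sorted, as in the captured stdout
```

The "Set current_interfaces" task queued next in this log would typically consume this result as `stdout_lines`, i.e. the stdout split on newlines.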
27844 1726882745.78274: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882745.78280: _low_level_execute_command(): starting 27844 1726882745.78285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882745.5103493-28118-227736315813586/ > /dev/null 2>&1 && sleep 0' 27844 1726882745.78735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.78740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.78795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.78798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882745.78800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
27844 1726882745.78803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.78850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.78855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882745.78869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.78978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.80799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.80848: stderr chunk (state=3): >>><<< 27844 1726882745.80851: stdout chunk (state=3): >>><<< 27844 1726882745.80866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882745.80874: handler run complete 27844 1726882745.80892: Evaluated conditional (False): False 27844 1726882745.80900: attempt loop complete, returning result 27844 1726882745.80903: _execute() done 27844 1726882745.80905: dumping result to json 27844 1726882745.80909: done dumping result, returning 27844 1726882745.80917: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-efa9-466a-0000000001d4] 27844 1726882745.80925: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000001d4 27844 1726882745.81023: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000001d4 27844 1726882745.81027: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003340", "end": "2024-09-20 21:39:05.767131", "rc": 0, "start": "2024-09-20 21:39:05.763791" } STDOUT: bonding_masters eth0 lo 27844 1726882745.81133: no more pending results, returning what we have 27844 1726882745.81137: results queue empty 27844 1726882745.81138: checking for any_errors_fatal 27844 1726882745.81140: done checking for any_errors_fatal 27844 1726882745.81141: checking for max_fail_percentage 27844 1726882745.81142: done checking for max_fail_percentage 27844 1726882745.81143: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.81144: done checking to see if all hosts have failed 27844 1726882745.81144: getting the remaining hosts for this loop 27844 1726882745.81145: done getting the remaining hosts for this loop 27844 1726882745.81149: getting the next task for host managed_node1 27844 1726882745.81154: done getting next task for host managed_node1 27844 1726882745.81156: ^ task is: TASK: Set current_interfaces 27844 1726882745.81160: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.81166: getting variables 27844 1726882745.81167: in VariableManager get_vars() 27844 1726882745.81203: Calling all_inventory to load vars for managed_node1 27844 1726882745.81206: Calling groups_inventory to load vars for managed_node1 27844 1726882745.81208: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.81218: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.81220: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.81222: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.81338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.81461: done with get_vars() 27844 1726882745.81474: done getting variables 27844 1726882745.81516: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:05 -0400 (0:00:00.359) 0:00:04.892 ****** 27844 1726882745.81542: entering _queue_task() for managed_node1/set_fact 27844 1726882745.81732: worker is 1 (out of 1 available) 27844 1726882745.81745: exiting _queue_task() for managed_node1/set_fact 27844 1726882745.81757: done queuing things up, now waiting for results queue to drain 27844 1726882745.81759: waiting for pending results... 
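[Annotation] The command result above (STDOUT: bonding_masters, eth0, lo) feeds the "Set current_interfaces" task being queued here. A minimal sketch of the equivalent fact derivation, assuming the playbook uses the usual stdout-lines split (the Jinja expression shown in the comment is an assumption, not read from this run's playbook):

```python
# Hedged sketch, not Ansible source: how a set_fact like
#   current_interfaces: "{{ _current_interfaces.stdout_lines }}"
# would turn the `ls -1 /sys/class/net` output above into a list.
stdout = "bonding_masters\neth0\nlo"      # STDOUT captured in the result above
current_interfaces = stdout.splitlines()  # Ansible exposes this as stdout_lines
print(current_interfaces)
```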
27844 1726882745.81906: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 27844 1726882745.81970: in run() - task 0e448fcc-3ce9-efa9-466a-0000000001d5 27844 1726882745.81984: variable 'ansible_search_path' from source: unknown 27844 1726882745.81988: variable 'ansible_search_path' from source: unknown 27844 1726882745.82020: calling self._execute() 27844 1726882745.82090: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.82094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.82102: variable 'omit' from source: magic vars 27844 1726882745.82415: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.82426: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.82433: variable 'omit' from source: magic vars 27844 1726882745.82477: variable 'omit' from source: magic vars 27844 1726882745.82548: variable '_current_interfaces' from source: set_fact 27844 1726882745.82601: variable 'omit' from source: magic vars 27844 1726882745.82635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882745.82675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882745.82691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882745.82703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.82712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.82733: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882745.82736: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.82739: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.82809: Set connection var ansible_shell_type to sh 27844 1726882745.82812: Set connection var ansible_connection to ssh 27844 1726882745.82814: Set connection var ansible_pipelining to False 27844 1726882745.82821: Set connection var ansible_timeout to 10 27844 1726882745.82826: Set connection var ansible_shell_executable to /bin/sh 27844 1726882745.82830: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882745.82850: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.82853: variable 'ansible_connection' from source: unknown 27844 1726882745.82855: variable 'ansible_module_compression' from source: unknown 27844 1726882745.82858: variable 'ansible_shell_type' from source: unknown 27844 1726882745.82860: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.82862: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.82868: variable 'ansible_pipelining' from source: unknown 27844 1726882745.82871: variable 'ansible_timeout' from source: unknown 27844 1726882745.82873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.82973: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882745.82983: variable 'omit' from source: magic vars 27844 1726882745.82986: starting attempt loop 27844 1726882745.82989: running the handler 27844 1726882745.83000: handler run complete 27844 1726882745.83008: attempt loop complete, returning result 27844 1726882745.83010: _execute() done 27844 1726882745.83013: dumping result to json 27844 1726882745.83016: done dumping result, returning 27844 
1726882745.83022: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-efa9-466a-0000000001d5] 27844 1726882745.83026: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000001d5 27844 1726882745.83107: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000001d5 27844 1726882745.83110: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 27844 1726882745.83161: no more pending results, returning what we have 27844 1726882745.83169: results queue empty 27844 1726882745.83170: checking for any_errors_fatal 27844 1726882745.83176: done checking for any_errors_fatal 27844 1726882745.83177: checking for max_fail_percentage 27844 1726882745.83178: done checking for max_fail_percentage 27844 1726882745.83179: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.83180: done checking to see if all hosts have failed 27844 1726882745.83180: getting the remaining hosts for this loop 27844 1726882745.83182: done getting the remaining hosts for this loop 27844 1726882745.83184: getting the next task for host managed_node1 27844 1726882745.83190: done getting next task for host managed_node1 27844 1726882745.83192: ^ task is: TASK: Show current_interfaces 27844 1726882745.83196: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882745.83198: getting variables 27844 1726882745.83200: in VariableManager get_vars() 27844 1726882745.83233: Calling all_inventory to load vars for managed_node1 27844 1726882745.83235: Calling groups_inventory to load vars for managed_node1 27844 1726882745.83236: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.83243: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.83245: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.83246: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.83382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.83503: done with get_vars() 27844 1726882745.83509: done getting variables 27844 1726882745.83547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:05 -0400 (0:00:00.020) 0:00:04.912 ****** 27844 1726882745.83571: entering _queue_task() for managed_node1/debug 27844 1726882745.83737: worker is 1 (out of 1 available) 27844 1726882745.83751: exiting _queue_task() for managed_node1/debug 27844 1726882745.83762: done queuing things up, now waiting for results queue to drain 27844 1726882745.83767: waiting for pending results... 
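[Annotation] The repeated `auto-mux: Trying existing master` / `mux_client_request_session: master session id: 2` stderr chunks throughout this run come from OpenSSH connection multiplexing, which Ansible's ssh connection plugin enables by default so each task reuses one SSH session. An illustrative ssh_config fragment showing the mechanism (values are typical defaults passed as `-o` options, not read from this run's configuration):

```
# Illustrative only -- Ansible normally injects equivalent -o options itself
Host *
    ControlMaster auto          # first connection becomes the mux master
    ControlPersist 60s          # master lingers after the session exits
    ControlPath ~/.ansible/cp/%C  # per-host socket for follow-up sessions
```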
27844 1726882745.83897: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 27844 1726882745.83958: in run() - task 0e448fcc-3ce9-efa9-466a-00000000019e 27844 1726882745.83974: variable 'ansible_search_path' from source: unknown 27844 1726882745.83982: variable 'ansible_search_path' from source: unknown 27844 1726882745.84018: calling self._execute() 27844 1726882745.84091: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.84100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.84110: variable 'omit' from source: magic vars 27844 1726882745.84415: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.84430: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.84438: variable 'omit' from source: magic vars 27844 1726882745.84480: variable 'omit' from source: magic vars 27844 1726882745.84566: variable 'current_interfaces' from source: set_fact 27844 1726882745.84593: variable 'omit' from source: magic vars 27844 1726882745.84629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882745.84660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882745.84683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882745.84701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.84715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.84740: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882745.84747: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.84752: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.84836: Set connection var ansible_shell_type to sh 27844 1726882745.84842: Set connection var ansible_connection to ssh 27844 1726882745.84850: Set connection var ansible_pipelining to False 27844 1726882745.84858: Set connection var ansible_timeout to 10 27844 1726882745.84871: Set connection var ansible_shell_executable to /bin/sh 27844 1726882745.84878: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882745.84902: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.84908: variable 'ansible_connection' from source: unknown 27844 1726882745.84913: variable 'ansible_module_compression' from source: unknown 27844 1726882745.84917: variable 'ansible_shell_type' from source: unknown 27844 1726882745.84923: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.84929: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.84935: variable 'ansible_pipelining' from source: unknown 27844 1726882745.84940: variable 'ansible_timeout' from source: unknown 27844 1726882745.84946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.85054: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882745.85071: variable 'omit' from source: magic vars 27844 1726882745.85080: starting attempt loop 27844 1726882745.85085: running the handler 27844 1726882745.85126: handler run complete 27844 1726882745.85144: attempt loop complete, returning result 27844 1726882745.85147: _execute() done 27844 1726882745.85149: dumping result to json 27844 1726882745.85151: done dumping result, returning 27844 1726882745.85157: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-efa9-466a-00000000019e] 27844 1726882745.85166: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000019e 27844 1726882745.85244: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000019e ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 27844 1726882745.85296: no more pending results, returning what we have 27844 1726882745.85299: results queue empty 27844 1726882745.85300: checking for any_errors_fatal 27844 1726882745.85306: done checking for any_errors_fatal 27844 1726882745.85307: checking for max_fail_percentage 27844 1726882745.85308: done checking for max_fail_percentage 27844 1726882745.85309: checking to see if all hosts have failed and the running result is not ok 27844 1726882745.85310: done checking to see if all hosts have failed 27844 1726882745.85311: getting the remaining hosts for this loop 27844 1726882745.85312: done getting the remaining hosts for this loop 27844 1726882745.85315: getting the next task for host managed_node1 27844 1726882745.85322: done getting next task for host managed_node1 27844 1726882745.85325: ^ task is: TASK: Install iproute 27844 1726882745.85327: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882745.85331: getting variables 27844 1726882745.85332: in VariableManager get_vars() 27844 1726882745.85372: Calling all_inventory to load vars for managed_node1 27844 1726882745.85375: Calling groups_inventory to load vars for managed_node1 27844 1726882745.85377: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882745.85387: Calling all_plugins_play to load vars for managed_node1 27844 1726882745.85389: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882745.85393: Calling groups_plugins_play to load vars for managed_node1 27844 1726882745.85538: WORKER PROCESS EXITING 27844 1726882745.85553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882745.85738: done with get_vars() 27844 1726882745.85746: done getting variables 27844 1726882745.85798: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:39:05 -0400 (0:00:00.022) 0:00:04.934 ****** 27844 1726882745.85820: entering _queue_task() for managed_node1/package 27844 1726882745.86019: worker is 1 (out of 1 available) 27844 1726882745.86031: exiting _queue_task() for managed_node1/package 27844 1726882745.86042: done queuing things up, now waiting for results queue to drain 27844 1726882745.86044: waiting for pending results... 
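[Annotation] The "Install iproute" task that follows begins with the same low-level bootstrap visible throughout this log: `echo ~` to resolve the remote home, then a `umask 77 && mkdir ...` one-liner to create a private per-task temp directory. A sketch of that tmpdir pattern, run locally via /bin/sh instead of over SSH (the base path is an illustrative stand-in for /root/.ansible/tmp; assumes a POSIX shell):

```python
# Sketch of the remote tmpdir creation pattern from this log
# (`( umask 77 && mkdir -p ... && mkdir ... && echo ... ) && sleep 0`).
# Paths here are illustrative; Ansible executes this on the managed node.
import os
import subprocess
import tempfile
import time

basedir = os.path.join(tempfile.gettempdir(), "ansible-demo-tmp")
name = f"ansible-tmp-{time.time()}-demo"
cmd = (
    f'( umask 77 && mkdir -p "{basedir}" '
    f'&& mkdir "{basedir}/{name}" '
    f'&& echo "{basedir}/{name}" ) && sleep 0'
)
result = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
tmpdir = result.stdout.strip()
print(tmpdir)  # the freshly created, mode-0700 temp directory
```

The `umask 77` is what makes the directory private (0700) without a separate chmod, and the trailing `&& sleep 0` mirrors Ansible's habit of forcing a final successful command in the pipeline.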
27844 1726882745.86276: running TaskExecutor() for managed_node1/TASK: Install iproute 27844 1726882745.86343: in run() - task 0e448fcc-3ce9-efa9-466a-00000000016d 27844 1726882745.86354: variable 'ansible_search_path' from source: unknown 27844 1726882745.86358: variable 'ansible_search_path' from source: unknown 27844 1726882745.86393: calling self._execute() 27844 1726882745.86448: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.86451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.86459: variable 'omit' from source: magic vars 27844 1726882745.86757: variable 'ansible_distribution_major_version' from source: facts 27844 1726882745.86771: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882745.86776: variable 'omit' from source: magic vars 27844 1726882745.86800: variable 'omit' from source: magic vars 27844 1726882745.86924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882745.88353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882745.88406: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882745.88431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882745.88457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882745.88485: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882745.88546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882745.88571: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882745.88591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882745.88616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882745.88627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882745.88699: variable '__network_is_ostree' from source: set_fact 27844 1726882745.88702: variable 'omit' from source: magic vars 27844 1726882745.88722: variable 'omit' from source: magic vars 27844 1726882745.88742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882745.88761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882745.88779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882745.88798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.88805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882745.88825: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882745.88828: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.88830: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 27844 1726882745.88896: Set connection var ansible_shell_type to sh 27844 1726882745.88899: Set connection var ansible_connection to ssh 27844 1726882745.88902: Set connection var ansible_pipelining to False 27844 1726882745.88904: Set connection var ansible_timeout to 10 27844 1726882745.88909: Set connection var ansible_shell_executable to /bin/sh 27844 1726882745.88914: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882745.88932: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.88934: variable 'ansible_connection' from source: unknown 27844 1726882745.88937: variable 'ansible_module_compression' from source: unknown 27844 1726882745.88939: variable 'ansible_shell_type' from source: unknown 27844 1726882745.88941: variable 'ansible_shell_executable' from source: unknown 27844 1726882745.88943: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882745.88947: variable 'ansible_pipelining' from source: unknown 27844 1726882745.88950: variable 'ansible_timeout' from source: unknown 27844 1726882745.88954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882745.89020: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882745.89028: variable 'omit' from source: magic vars 27844 1726882745.89033: starting attempt loop 27844 1726882745.89036: running the handler 27844 1726882745.89041: variable 'ansible_facts' from source: unknown 27844 1726882745.89043: variable 'ansible_facts' from source: unknown 27844 1726882745.89068: _low_level_execute_command(): starting 27844 1726882745.89080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 
1726882745.89569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.89584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.89602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882745.89614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.89624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.89669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.89683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.89793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.91465: stdout chunk (state=3): >>>/root <<< 27844 1726882745.91571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.91615: stderr chunk (state=3): >>><<< 27844 1726882745.91618: stdout chunk (state=3): >>><<< 27844 1726882745.91635: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882745.91648: _low_level_execute_command(): starting 27844 1726882745.91651: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149 `" && echo ansible-tmp-1726882745.9163342-28155-228665358699149="` echo /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149 `" ) && sleep 0' 27844 1726882745.92082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882745.92085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882745.92118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.92122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882745.92124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882745.92168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882745.92182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882745.92288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882745.94143: stdout chunk (state=3): >>>ansible-tmp-1726882745.9163342-28155-228665358699149=/root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149 <<< 27844 1726882745.94259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882745.94307: stderr chunk (state=3): >>><<< 27844 1726882745.94310: stdout chunk (state=3): >>><<< 27844 1726882745.94322: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882745.9163342-28155-228665358699149=/root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882745.94343: variable 'ansible_module_compression' from source: unknown 27844 1726882745.94391: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 27844 1726882745.94395: ANSIBALLZ: Acquiring lock 27844 1726882745.94397: ANSIBALLZ: Lock acquired: 139916607833536 27844 1726882745.94400: ANSIBALLZ: Creating module 27844 1726882746.10385: ANSIBALLZ: Writing module into payload 27844 1726882746.10677: ANSIBALLZ: Writing module 27844 1726882746.10714: ANSIBALLZ: Renaming module 27844 1726882746.10732: ANSIBALLZ: Done creating module 27844 1726882746.10755: variable 'ansible_facts' from source: unknown 27844 1726882746.10856: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149/AnsiballZ_dnf.py 27844 1726882746.11029: Sending initial data 27844 1726882746.11033: Sent initial data (152 bytes) 27844 1726882746.13788: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882746.13798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882746.13808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882746.13822: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882746.13863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882746.13876: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882746.13885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882746.13899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882746.13907: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882746.13914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882746.13922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882746.13931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882746.13943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882746.13950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882746.13957: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882746.13967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882746.14041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882746.14061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882746.14078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882746.14210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882746.16057: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882746.16143: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882746.16236: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpvh1pv3yb /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149/AnsiballZ_dnf.py <<< 27844 1726882746.16324: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882746.17985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882746.17989: stdout chunk (state=3): >>><<< 27844 1726882746.17997: stderr chunk (state=3): >>><<< 27844 1726882746.18016: done transferring module to remote 27844 1726882746.18028: _low_level_execute_command(): starting 27844 1726882746.18033: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149/ /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149/AnsiballZ_dnf.py && sleep 0' 27844 1726882746.18674: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882746.18684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882746.18695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882746.18708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 27844 1726882746.18745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882746.18752: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882746.18762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882746.18781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882746.18789: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882746.18796: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882746.18803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882746.18812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882746.18824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882746.18831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882746.18838: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882746.18847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882746.18922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882746.18936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882746.18946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882746.19067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882746.20849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882746.20923: stderr chunk (state=3): >>><<< 27844 1726882746.20934: stdout chunk (state=3): >>><<< 27844 1726882746.21031: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882746.21035: _low_level_execute_command(): starting 27844 1726882746.21037: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149/AnsiballZ_dnf.py && sleep 0' 27844 1726882746.21597: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882746.21612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882746.21627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882746.21644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882746.21693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 27844 1726882746.21706: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882746.21720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882746.21737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882746.21750: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882746.21762: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882746.21778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882746.21798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882746.21815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882746.21828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882746.21840: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882746.21854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882746.21937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882746.21958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882746.21978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882746.22109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.22041: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], 
"disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 27844 1726882747.27767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882747.27772: stdout chunk (state=3): >>><<< 27844 1726882747.27775: stderr chunk (state=3): >>><<< 27844 1726882747.27928: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882747.27938: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882747.27941: _low_level_execute_command(): starting 27844 1726882747.27943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882745.9163342-28155-228665358699149/ > /dev/null 2>&1 && sleep 0' 27844 1726882747.28956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882747.28967: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 27844 1726882747.28986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.28999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.29038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.29046: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882747.29055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.29073: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882747.29083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882747.29093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882747.29101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.29110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.29121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.29128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.29136: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882747.29143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.29221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.29235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.29249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.29383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 27844 1726882747.31188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.31266: stderr chunk (state=3): >>><<< 27844 1726882747.31273: stdout chunk (state=3): >>><<< 27844 1726882747.31292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882747.31299: handler run complete 27844 1726882747.31466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882747.31643: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882747.31685: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882747.31715: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 
1726882747.31743: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882747.31820: variable '__install_status' from source: unknown 27844 1726882747.31844: Evaluated conditional (__install_status is success): True 27844 1726882747.31858: attempt loop complete, returning result 27844 1726882747.31861: _execute() done 27844 1726882747.31865: dumping result to json 27844 1726882747.31868: done dumping result, returning 27844 1726882747.31879: done running TaskExecutor() for managed_node1/TASK: Install iproute [0e448fcc-3ce9-efa9-466a-00000000016d] 27844 1726882747.31884: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016d 27844 1726882747.31990: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016d 27844 1726882747.31992: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 27844 1726882747.32155: no more pending results, returning what we have 27844 1726882747.32159: results queue empty 27844 1726882747.32160: checking for any_errors_fatal 27844 1726882747.32176: done checking for any_errors_fatal 27844 1726882747.32178: checking for max_fail_percentage 27844 1726882747.32180: done checking for max_fail_percentage 27844 1726882747.32180: checking to see if all hosts have failed and the running result is not ok 27844 1726882747.32182: done checking to see if all hosts have failed 27844 1726882747.32182: getting the remaining hosts for this loop 27844 1726882747.32184: done getting the remaining hosts for this loop 27844 1726882747.32188: getting the next task for host managed_node1 27844 1726882747.32194: done getting next task for host managed_node1 27844 1726882747.32197: ^ task is: TASK: Create veth interface {{ interface }} 27844 1726882747.32199: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882747.32203: getting variables 27844 1726882747.32204: in VariableManager get_vars() 27844 1726882747.32238: Calling all_inventory to load vars for managed_node1 27844 1726882747.32242: Calling groups_inventory to load vars for managed_node1 27844 1726882747.32244: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882747.32256: Calling all_plugins_play to load vars for managed_node1 27844 1726882747.32259: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882747.32270: Calling groups_plugins_play to load vars for managed_node1 27844 1726882747.32437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882747.32654: done with get_vars() 27844 1726882747.32666: done getting variables 27844 1726882747.32948: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882747.33188: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:39:07 -0400 (0:00:01.474) 0:00:06.409 ****** 27844 1726882747.33273: entering _queue_task() for 
managed_node1/command 27844 1726882747.33559: worker is 1 (out of 1 available) 27844 1726882747.33573: exiting _queue_task() for managed_node1/command 27844 1726882747.33594: done queuing things up, now waiting for results queue to drain 27844 1726882747.33596: waiting for pending results... 27844 1726882747.33867: running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 27844 1726882747.33984: in run() - task 0e448fcc-3ce9-efa9-466a-00000000016e 27844 1726882747.34003: variable 'ansible_search_path' from source: unknown 27844 1726882747.34011: variable 'ansible_search_path' from source: unknown 27844 1726882747.34311: variable 'interface' from source: set_fact 27844 1726882747.34406: variable 'interface' from source: set_fact 27844 1726882747.34493: variable 'interface' from source: set_fact 27844 1726882747.34642: Loaded config def from plugin (lookup/items) 27844 1726882747.34654: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 27844 1726882747.34694: variable 'omit' from source: magic vars 27844 1726882747.34826: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882747.34842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882747.34859: variable 'omit' from source: magic vars 27844 1726882747.35098: variable 'ansible_distribution_major_version' from source: facts 27844 1726882747.35111: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882747.35334: variable 'type' from source: set_fact 27844 1726882747.35356: variable 'state' from source: include params 27844 1726882747.35368: variable 'interface' from source: set_fact 27844 1726882747.35376: variable 'current_interfaces' from source: set_fact 27844 1726882747.35386: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27844 1726882747.35395: variable 'omit' from source: magic vars 
27844 1726882747.35444: variable 'omit' from source: magic vars 27844 1726882747.35492: variable 'item' from source: unknown 27844 1726882747.35539: variable 'item' from source: unknown 27844 1726882747.35552: variable 'omit' from source: magic vars 27844 1726882747.35597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882747.35618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882747.35631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882747.35675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882747.35679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882747.35703: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882747.35706: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882747.35708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882747.35774: Set connection var ansible_shell_type to sh 27844 1726882747.35777: Set connection var ansible_connection to ssh 27844 1726882747.35786: Set connection var ansible_pipelining to False 27844 1726882747.35793: Set connection var ansible_timeout to 10 27844 1726882747.35799: Set connection var ansible_shell_executable to /bin/sh 27844 1726882747.35804: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882747.35822: variable 'ansible_shell_executable' from source: unknown 27844 1726882747.35824: variable 'ansible_connection' from source: unknown 27844 1726882747.35827: variable 'ansible_module_compression' from source: unknown 27844 1726882747.35829: variable 'ansible_shell_type' from source: unknown 27844 1726882747.35831: 
variable 'ansible_shell_executable' from source: unknown 27844 1726882747.35833: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882747.35838: variable 'ansible_pipelining' from source: unknown 27844 1726882747.35840: variable 'ansible_timeout' from source: unknown 27844 1726882747.35844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882747.35943: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882747.35959: variable 'omit' from source: magic vars 27844 1726882747.35962: starting attempt loop 27844 1726882747.35967: running the handler 27844 1726882747.35978: _low_level_execute_command(): starting 27844 1726882747.35984: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882747.36480: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.36489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.36513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.36526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.36590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.36596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.36700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.38275: stdout chunk (state=3): >>>/root <<< 27844 1726882747.38602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.38606: stdout chunk (state=3): >>><<< 27844 1726882747.38615: stderr chunk (state=3): >>><<< 27844 1726882747.38635: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 27844 1726882747.38646: _low_level_execute_command(): starting 27844 1726882747.38651: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838 `" && echo ansible-tmp-1726882747.3863358-28208-90314214980838="` echo /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838 `" ) && sleep 0' 27844 1726882747.39518: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882747.39532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.39545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.39569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.39616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.39630: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882747.39645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.39670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882747.39687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882747.39707: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882747.39721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.39737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.39753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.39769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 27844 1726882747.39781: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882747.39794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.39879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.39895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.39913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.40039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.41889: stdout chunk (state=3): >>>ansible-tmp-1726882747.3863358-28208-90314214980838=/root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838 <<< 27844 1726882747.42003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.42077: stderr chunk (state=3): >>><<< 27844 1726882747.42089: stdout chunk (state=3): >>><<< 27844 1726882747.42171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882747.3863358-28208-90314214980838=/root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882747.42175: variable 'ansible_module_compression' from source: unknown 27844 1726882747.42369: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882747.42372: variable 'ansible_facts' from source: unknown 27844 1726882747.42375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838/AnsiballZ_command.py 27844 1726882747.42508: Sending initial data 27844 1726882747.42511: Sent initial data (155 bytes) 27844 1726882747.43478: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882747.43492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.43505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.43521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.43562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.43578: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882747.43591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.43607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882747.43621: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882747.43633: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 27844 1726882747.43644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.43659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.43679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.43690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.43700: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882747.43712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.43796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.43816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.43830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.44117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.45678: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882747.45768: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882747.45861: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpq7aiu7pc /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838/AnsiballZ_command.py <<< 27844 1726882747.45954: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882747.47371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.47520: stderr chunk (state=3): >>><<< 27844 1726882747.47523: stdout chunk (state=3): >>><<< 27844 1726882747.47525: done transferring module to remote 27844 1726882747.47531: _low_level_execute_command(): starting 27844 1726882747.47534: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838/ /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838/AnsiballZ_command.py && sleep 0' 27844 1726882747.48371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.48375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.48422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882747.48425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.48428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.48430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.48499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.48505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.48508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.48626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.50348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.50410: stderr chunk (state=3): >>><<< 27844 1726882747.50413: stdout chunk (state=3): >>><<< 27844 1726882747.50496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882747.50500: _low_level_execute_command(): starting 27844 
1726882747.50502: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838/AnsiballZ_command.py && sleep 0' 27844 1726882747.51601: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.51605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.51631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.51654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882747.51657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.51659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.51720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.51732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.51852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.67049: stdout chunk (state=3): >>> <<< 27844 1726882747.67055: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", 
"peer", "name", "peerethtest0"], "start": "2024-09-20 21:39:07.647500", "end": "2024-09-20 21:39:07.659675", "delta": "0:00:00.012175", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882747.70824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882747.70913: stderr chunk (state=3): >>><<< 27844 1726882747.70917: stdout chunk (state=3): >>><<< 27844 1726882747.70932: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:39:07.647500", "end": "2024-09-20 21:39:07.659675", "delta": "0:00:00.012175", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882747.70968: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882747.70975: _low_level_execute_command(): starting 27844 1726882747.70986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882747.3863358-28208-90314214980838/ > /dev/null 2>&1 && sleep 0' 27844 1726882747.71743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882747.71748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.71750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.71752: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.71787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.72003: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882747.72010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.72013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882747.72015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882747.72020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882747.72024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.72026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.72028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.72030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.72031: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882747.72033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.72035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.72037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.72039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.72199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.74641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.74688: stderr chunk (state=3): >>><<< 27844 1726882747.74692: 
stdout chunk (state=3): >>><<< 27844 1726882747.74704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882747.74710: handler run complete 27844 1726882747.74727: Evaluated conditional (False): False 27844 1726882747.74735: attempt loop complete, returning result 27844 1726882747.74753: variable 'item' from source: unknown 27844 1726882747.74820: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.012175", "end": "2024-09-20 21:39:07.659675", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 21:39:07.647500" } 27844 1726882747.74994: variable 'ansible_host' from 
source: host vars for 'managed_node1' 27844 1726882747.74997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882747.74999: variable 'omit' from source: magic vars 27844 1726882747.75067: variable 'ansible_distribution_major_version' from source: facts 27844 1726882747.75074: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882747.75208: variable 'type' from source: set_fact 27844 1726882747.75211: variable 'state' from source: include params 27844 1726882747.75213: variable 'interface' from source: set_fact 27844 1726882747.75227: variable 'current_interfaces' from source: set_fact 27844 1726882747.75230: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27844 1726882747.75232: variable 'omit' from source: magic vars 27844 1726882747.75241: variable 'omit' from source: magic vars 27844 1726882747.75271: variable 'item' from source: unknown 27844 1726882747.75313: variable 'item' from source: unknown 27844 1726882747.75324: variable 'omit' from source: magic vars 27844 1726882747.75344: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882747.75351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882747.75357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882747.75371: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882747.75374: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882747.75376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882747.75423: Set connection var ansible_shell_type to sh 27844 1726882747.75426: Set connection var 
ansible_connection to ssh 27844 1726882747.75429: Set connection var ansible_pipelining to False 27844 1726882747.75442: Set connection var ansible_timeout to 10 27844 1726882747.75448: Set connection var ansible_shell_executable to /bin/sh 27844 1726882747.75451: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882747.75462: variable 'ansible_shell_executable' from source: unknown 27844 1726882747.75466: variable 'ansible_connection' from source: unknown 27844 1726882747.75471: variable 'ansible_module_compression' from source: unknown 27844 1726882747.75474: variable 'ansible_shell_type' from source: unknown 27844 1726882747.75478: variable 'ansible_shell_executable' from source: unknown 27844 1726882747.75480: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882747.75484: variable 'ansible_pipelining' from source: unknown 27844 1726882747.75487: variable 'ansible_timeout' from source: unknown 27844 1726882747.75491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882747.75598: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882747.75604: variable 'omit' from source: magic vars 27844 1726882747.75606: starting attempt loop 27844 1726882747.75609: running the handler 27844 1726882747.75614: _low_level_execute_command(): starting 27844 1726882747.75617: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882747.76234: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882747.76245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.76262: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 27844 1726882747.76313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.76373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.76381: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882747.76392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.76406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882747.76412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882747.76419: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882747.76430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.76451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.76454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.76466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.76474: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882747.76483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.76600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.76604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.76620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.76736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.78281: stdout chunk (state=3): >>>/root <<< 27844 1726882747.78415: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.78419: stdout chunk (state=3): >>><<< 27844 1726882747.78424: stderr chunk (state=3): >>><<< 27844 1726882747.78437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882747.78444: _low_level_execute_command(): starting 27844 1726882747.78447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326 `" && echo ansible-tmp-1726882747.7843516-28208-14547814573326="` echo /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326 `" ) && sleep 0' 27844 1726882747.79088: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 27844 1726882747.79091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.79129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.79132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.79134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882747.79136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.79185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.79188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.79289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.81136: stdout chunk (state=3): >>>ansible-tmp-1726882747.7843516-28208-14547814573326=/root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326 <<< 27844 1726882747.81248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.81301: stderr chunk (state=3): >>><<< 27844 1726882747.81305: stdout chunk (state=3): >>><<< 27844 1726882747.81316: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882747.7843516-28208-14547814573326=/root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882747.81333: variable 'ansible_module_compression' from source: unknown 27844 1726882747.81361: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882747.81379: variable 'ansible_facts' from source: unknown 27844 1726882747.81426: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326/AnsiballZ_command.py 27844 1726882747.81525: Sending initial data 27844 1726882747.81532: Sent initial data (155 bytes) 27844 1726882747.82177: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882747.82181: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.82193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.82242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.82268: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882747.82281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.82287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882747.82295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.82303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.82312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.82317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.82394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.82402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.82493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.84266: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 27844 1726882747.84275: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882747.84358: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 27844 1726882747.84386: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 27844 1726882747.84389: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 27844 1726882747.84484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpkxws5aa9 /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326/AnsiballZ_command.py <<< 27844 1726882747.84577: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882747.85671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.85783: stderr chunk (state=3): >>><<< 27844 1726882747.85787: stdout chunk (state=3): >>><<< 27844 1726882747.85800: done transferring module to remote 27844 1726882747.85807: _low_level_execute_command(): starting 27844 1726882747.85811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326/ /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326/AnsiballZ_command.py && sleep 0' 27844 1726882747.86416: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.86440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.86473: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.86512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882747.86526: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.86576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882747.86588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.86697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882747.88402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882747.88442: stderr chunk (state=3): >>><<< 27844 1726882747.88446: stdout chunk (state=3): >>><<< 27844 1726882747.88467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882747.88471: _low_level_execute_command(): starting 27844 1726882747.88473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326/AnsiballZ_command.py && sleep 0' 27844 1726882747.88894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882747.88897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882747.88926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882747.88929: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882747.88931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
27844 1726882747.88981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882747.88985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882747.89100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.02598: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:39:08.020418", "end": "2024-09-20 21:39:08.024358", "delta": "0:00:00.003940", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882748.03733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882748.03791: stderr chunk (state=3): >>><<< 27844 1726882748.03796: stdout chunk (state=3): >>><<< 27844 1726882748.03815: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:39:08.020418", "end": "2024-09-20 21:39:08.024358", "delta": "0:00:00.003940", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882748.03862: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882748.03868: _low_level_execute_command(): starting 27844 1726882748.03870: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882747.7843516-28208-14547814573326/ > /dev/null 2>&1 && sleep 0' 27844 1726882748.04582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.04632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.04684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.04711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.04722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.04837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.06630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.06680: stderr chunk (state=3): >>><<< 27844 1726882748.06684: stdout chunk (state=3): >>><<< 27844 1726882748.06697: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.06702: handler run complete 27844 1726882748.06717: Evaluated conditional (False): 
False 27844 1726882748.06725: attempt loop complete, returning result 27844 1726882748.06739: variable 'item' from source: unknown 27844 1726882748.06807: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003940", "end": "2024-09-20 21:39:08.024358", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 21:39:08.020418" } 27844 1726882748.06927: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.06930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.06933: variable 'omit' from source: magic vars 27844 1726882748.07026: variable 'ansible_distribution_major_version' from source: facts 27844 1726882748.07030: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882748.07194: variable 'type' from source: set_fact 27844 1726882748.07197: variable 'state' from source: include params 27844 1726882748.07200: variable 'interface' from source: set_fact 27844 1726882748.07206: variable 'current_interfaces' from source: set_fact 27844 1726882748.07209: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27844 1726882748.07216: variable 'omit' from source: magic vars 27844 1726882748.07489: variable 'omit' from source: magic vars 27844 1726882748.07491: variable 'item' from source: unknown 27844 1726882748.07493: variable 'item' from source: unknown 27844 1726882748.07495: variable 'omit' from source: magic vars 27844 1726882748.07498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882748.07500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
27844 1726882748.07502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882748.07504: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882748.07506: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.07508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.07779: Set connection var ansible_shell_type to sh 27844 1726882748.07782: Set connection var ansible_connection to ssh 27844 1726882748.07784: Set connection var ansible_pipelining to False 27844 1726882748.07786: Set connection var ansible_timeout to 10 27844 1726882748.07788: Set connection var ansible_shell_executable to /bin/sh 27844 1726882748.07790: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882748.07792: variable 'ansible_shell_executable' from source: unknown 27844 1726882748.07794: variable 'ansible_connection' from source: unknown 27844 1726882748.07796: variable 'ansible_module_compression' from source: unknown 27844 1726882748.07797: variable 'ansible_shell_type' from source: unknown 27844 1726882748.07799: variable 'ansible_shell_executable' from source: unknown 27844 1726882748.07801: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.07803: variable 'ansible_pipelining' from source: unknown 27844 1726882748.07805: variable 'ansible_timeout' from source: unknown 27844 1726882748.07806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.08078: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882748.08081: variable 'omit' from 
source: magic vars 27844 1726882748.08083: starting attempt loop 27844 1726882748.08085: running the handler 27844 1726882748.08087: _low_level_execute_command(): starting 27844 1726882748.08089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882748.08513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.08516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.08518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.08520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.08571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.08574: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.08577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.08579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.08581: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.08583: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.08624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.08627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.08629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.08631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.08633: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882748.08635: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.08738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.08742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.08744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.08994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.10406: stdout chunk (state=3): >>>/root <<< 27844 1726882748.10571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.10575: stdout chunk (state=3): >>><<< 27844 1726882748.10583: stderr chunk (state=3): >>><<< 27844 1726882748.10598: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.10607: _low_level_execute_command(): 
starting 27844 1726882748.10613: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592 `" && echo ansible-tmp-1726882748.1059828-28208-122326498947592="` echo /root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592 `" ) && sleep 0' 27844 1726882748.11915: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.12580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.12591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.12605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.12645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.12654: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.12661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.12678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.12686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.12691: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.12698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.12707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.12718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.12725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.12731: stderr 
chunk (state=3): >>>debug2: match found <<< 27844 1726882748.12740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.12816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.12833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.12845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.12969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.14837: stdout chunk (state=3): >>>ansible-tmp-1726882748.1059828-28208-122326498947592=/root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592 <<< 27844 1726882748.15018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.15021: stdout chunk (state=3): >>><<< 27844 1726882748.15029: stderr chunk (state=3): >>><<< 27844 1726882748.15048: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882748.1059828-28208-122326498947592=/root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.15074: variable 'ansible_module_compression' from source: unknown 27844 1726882748.15114: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882748.15132: variable 'ansible_facts' from source: unknown 27844 1726882748.15200: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592/AnsiballZ_command.py 27844 1726882748.15654: Sending initial data 27844 1726882748.15657: Sent initial data (156 bytes) 27844 1726882748.17997: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.18105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.18118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.18133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.18173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.18229: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.18239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.18252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.18259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.18269: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.18276: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.18330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.18344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.18352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.18358: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882748.18370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.18441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.18575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.18588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.18709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.20521: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882748.20600: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882748.20709: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpfp7mlc3h 
/root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592/AnsiballZ_command.py <<< 27844 1726882748.20811: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882748.22602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.22845: stderr chunk (state=3): >>><<< 27844 1726882748.22848: stdout chunk (state=3): >>><<< 27844 1726882748.22850: done transferring module to remote 27844 1726882748.22852: _low_level_execute_command(): starting 27844 1726882748.22854: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592/ /root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592/AnsiballZ_command.py && sleep 0' 27844 1726882748.26416: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.26567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.26584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.26603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.26642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.26667: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.26688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.26705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.26717: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.26729: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.26739: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 27844 1726882748.26750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.26774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.26794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.26806: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882748.26819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.26959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.27015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.27035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.27223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.29032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.29035: stdout chunk (state=3): >>><<< 27844 1726882748.29038: stderr chunk (state=3): >>><<< 27844 1726882748.29071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.29075: _low_level_execute_command(): starting 27844 1726882748.29078: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592/AnsiballZ_command.py && sleep 0' 27844 1726882748.30707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.30713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.30806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.44478: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:39:08.436104", "end": "2024-09-20 21:39:08.442392", "delta": "0:00:00.006288", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882748.45604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882748.45654: stderr chunk (state=3): >>><<< 27844 1726882748.45658: stdout chunk (state=3): >>><<< 27844 1726882748.45679: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:39:08.436104", "end": "2024-09-20 21:39:08.442392", "delta": "0:00:00.006288", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882748.45700: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882748.45705: _low_level_execute_command(): starting 27844 1726882748.45709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882748.1059828-28208-122326498947592/ > /dev/null 2>&1 && sleep 0' 27844 1726882748.46176: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.46180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.46223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.46226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.46287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.46312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.46424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.48285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.48289: stdout chunk (state=3): >>><<< 27844 1726882748.48291: stderr chunk (state=3): >>><<< 27844 1726882748.48382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.48385: handler run complete 27844 1726882748.48388: Evaluated conditional (False): False 27844 1726882748.48390: attempt loop complete, returning result 27844 1726882748.48392: variable 'item' from source: unknown 27844 1726882748.48576: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.006288", "end": "2024-09-20 21:39:08.442392", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 21:39:08.436104" } 27844 1726882748.48678: dumping result to json 27844 1726882748.48681: done dumping result, returning 27844 1726882748.48684: done running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest0 [0e448fcc-3ce9-efa9-466a-00000000016e] 27844 1726882748.48686: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016e 27844 1726882748.48824: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016e 27844 1726882748.48827: WORKER PROCESS EXITING 27844 1726882748.48889: no more pending results, returning what we have 27844 1726882748.48892: results queue empty 27844 1726882748.48893: checking for any_errors_fatal 27844 1726882748.48898: done checking for any_errors_fatal 27844 1726882748.48899: checking for max_fail_percentage 27844 1726882748.48900: done checking for max_fail_percentage 27844 1726882748.48901: checking to see if all hosts have failed and the running result is not ok 
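The record above shows how a module run comes back to the controller: the remote `AnsiballZ_command.py` prints a single JSON object on stdout (`changed`, `rc`, `cmd`, `start`/`end`, `delta`, …), and Ansible parses it to decide whether the task succeeded. A minimal sketch of that decoding step, using the exact fields seen in this log; the helper name `module_succeeded` is illustrative, not Ansible's real internal API:

```python
import json

# Payload mirrors the command-module result JSON captured in this log.
raw = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "set", "ethtest0", "up"],
 "start": "2024-09-20 21:39:08.436104", "end": "2024-09-20 21:39:08.442392",
 "delta": "0:00:00.006288", "msg": ""}'''

def module_succeeded(payload: str) -> bool:
    """Return True when a command-module result reports rc == 0 and no failure flag."""
    result = json.loads(payload)
    return result.get("rc", 1) == 0 and not result.get("failed", False)

print(module_succeeded(raw))  # rc is 0, so this prints True
```

The same check explains the `ok: [managed_node1] => (item=ip link set ethtest0 up)` line that follows: rc 0 with no `failed` key renders as `ok`.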
27844 1726882748.48902: done checking to see if all hosts have failed 27844 1726882748.48902: getting the remaining hosts for this loop 27844 1726882748.48904: done getting the remaining hosts for this loop 27844 1726882748.48907: getting the next task for host managed_node1 27844 1726882748.48912: done getting next task for host managed_node1 27844 1726882748.48914: ^ task is: TASK: Set up veth as managed by NetworkManager 27844 1726882748.48917: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882748.48920: getting variables 27844 1726882748.48921: in VariableManager get_vars() 27844 1726882748.48955: Calling all_inventory to load vars for managed_node1 27844 1726882748.48957: Calling groups_inventory to load vars for managed_node1 27844 1726882748.48959: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882748.48970: Calling all_plugins_play to load vars for managed_node1 27844 1726882748.48972: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882748.48975: Calling groups_plugins_play to load vars for managed_node1 27844 1726882748.49193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882748.49414: done with get_vars() 27844 1726882748.49424: done getting variables 27844 1726882748.49489: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:39:08 -0400 (0:00:01.162) 0:00:07.571 ****** 27844 1726882748.49518: entering _queue_task() for managed_node1/command 27844 1726882748.49785: worker is 1 (out of 1 available) 27844 1726882748.49798: exiting _queue_task() for managed_node1/command 27844 1726882748.49812: done queuing things up, now waiting for results queue to drain 27844 1726882748.49813: waiting for pending results... 
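The `entering _queue_task()` / `worker is 1` / `waiting for pending results...` lines above trace Ansible's strategy loop: tasks are handed to worker processes through a queue, and the main process drains a results queue. A hedged sketch of that queue-and-drain shape in miniature; the structure is illustrative of the pattern in this log, not Ansible's actual multiprocessing internals:

```python
import queue
import threading

# One task queue feeding a worker, one results queue drained by the caller.
tasks: "queue.Queue[object]" = queue.Queue()
results: "queue.Queue[tuple]" = queue.Queue()

def worker() -> None:
    while True:
        task = tasks.get()
        if task is None:              # sentinel: no more work for this worker
            break
        results.put((task, "ok"))     # "run" the task and report its result

t = threading.Thread(target=worker)
t.start()
tasks.put("Set up veth as managed by NetworkManager")
tasks.put(None)                        # tell the worker to exit
t.join()                               # worker process exiting
r = results.get()                      # no more pending results
print(r)
```

The `WORKER PROCESS EXITING` and `no more pending results, returning what we have` records in this log correspond to the sentinel shutdown and the final drain of the results queue.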
27844 1726882748.50087: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 27844 1726882748.50191: in run() - task 0e448fcc-3ce9-efa9-466a-00000000016f 27844 1726882748.50212: variable 'ansible_search_path' from source: unknown 27844 1726882748.50224: variable 'ansible_search_path' from source: unknown 27844 1726882748.50277: calling self._execute() 27844 1726882748.50376: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.50386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.50400: variable 'omit' from source: magic vars 27844 1726882748.50779: variable 'ansible_distribution_major_version' from source: facts 27844 1726882748.50801: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882748.50968: variable 'type' from source: set_fact 27844 1726882748.50980: variable 'state' from source: include params 27844 1726882748.50992: Evaluated conditional (type == 'veth' and state == 'present'): True 27844 1726882748.51003: variable 'omit' from source: magic vars 27844 1726882748.51045: variable 'omit' from source: magic vars 27844 1726882748.51141: variable 'interface' from source: set_fact 27844 1726882748.51155: variable 'omit' from source: magic vars 27844 1726882748.51204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882748.51239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882748.51255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882748.51269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882748.51281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
27844 1726882748.51302: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882748.51305: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.51313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.51393: Set connection var ansible_shell_type to sh 27844 1726882748.51396: Set connection var ansible_connection to ssh 27844 1726882748.51399: Set connection var ansible_pipelining to False 27844 1726882748.51405: Set connection var ansible_timeout to 10 27844 1726882748.51410: Set connection var ansible_shell_executable to /bin/sh 27844 1726882748.51415: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882748.51447: variable 'ansible_shell_executable' from source: unknown 27844 1726882748.51455: variable 'ansible_connection' from source: unknown 27844 1726882748.51461: variable 'ansible_module_compression' from source: unknown 27844 1726882748.51469: variable 'ansible_shell_type' from source: unknown 27844 1726882748.51475: variable 'ansible_shell_executable' from source: unknown 27844 1726882748.51477: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.51481: variable 'ansible_pipelining' from source: unknown 27844 1726882748.51483: variable 'ansible_timeout' from source: unknown 27844 1726882748.51488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.51621: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882748.51626: variable 'omit' from source: magic vars 27844 1726882748.51632: starting attempt loop 27844 1726882748.51643: running the handler 27844 1726882748.51669: _low_level_execute_command(): 
starting 27844 1726882748.51682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882748.52413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.52429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.52448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.52471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.52514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.52526: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.52542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.52562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.52576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.52586: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.52598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.52612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.52631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.52644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.52660: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882748.52679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.52759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 
1726882748.52788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.52806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.52962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.54514: stdout chunk (state=3): >>>/root <<< 27844 1726882748.54618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.54706: stderr chunk (state=3): >>><<< 27844 1726882748.54718: stdout chunk (state=3): >>><<< 27844 1726882748.54772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.54776: _low_level_execute_command(): starting 27844 1726882748.54830: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384 `" && echo ansible-tmp-1726882748.5475025-28267-151124388570384="` echo /root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384 `" ) && sleep 0' 27844 1726882748.55362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.55379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.55401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.55407: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.55419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.55431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.55437: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.55559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.55648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.57520: stdout chunk (state=3): >>>ansible-tmp-1726882748.5475025-28267-151124388570384=/root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384 <<< 
27844 1726882748.57719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.57721: stdout chunk (state=3): >>><<< 27844 1726882748.57724: stderr chunk (state=3): >>><<< 27844 1726882748.57982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882748.5475025-28267-151124388570384=/root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.57986: variable 'ansible_module_compression' from source: unknown 27844 1726882748.57988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882748.57990: variable 'ansible_facts' from source: unknown 27844 1726882748.57995: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384/AnsiballZ_command.py 27844 1726882748.58206: Sending initial data 27844 1726882748.58209: Sent initial data (156 bytes) 27844 1726882748.59658: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.59671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.59683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.59697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.59736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.59740: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.59749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.59775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.59778: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.59780: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.59782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.59802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.59805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.59807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.59815: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882748.59828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 
1726882748.59906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.59910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.59918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.60429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.62169: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882748.62258: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882748.62353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp3kkatn8t /root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384/AnsiballZ_command.py <<< 27844 1726882748.62442: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882748.64142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.64287: stderr chunk (state=3): >>><<< 27844 1726882748.64290: stdout chunk (state=3): >>><<< 27844 1726882748.64293: done transferring module to remote 27844 1726882748.64295: _low_level_execute_command(): starting 27844 1726882748.64297: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384/ /root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384/AnsiballZ_command.py && sleep 0' 27844 1726882748.64780: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.64783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.64815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.64820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.64823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.64885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.64892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.64986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.66734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.66797: stderr chunk (state=3): >>><<< 27844 1726882748.66800: stdout chunk (state=3): >>><<< 27844 1726882748.66869: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.66872: _low_level_execute_command(): starting 27844 1726882748.66875: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384/AnsiballZ_command.py && sleep 0' 27844 1726882748.67607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.67630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.67656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.67686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.67727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 
1726882748.67747: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.67775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.67802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.67825: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.67843: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.67867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.67898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.67915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.67934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.67962: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882748.67982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.68071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.68105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.68123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.68249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.83360: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:39:08.811205", "end": "2024-09-20 21:39:08.831562", "delta": "0:00:00.020357", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882748.84614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882748.84638: stdout chunk (state=3): >>><<< 27844 1726882748.84645: stderr chunk (state=3): >>><<< 27844 1726882748.84704: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:39:08.811205", "end": "2024-09-20 21:39:08.831562", "delta": "0:00:00.020357", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882748.84753: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882748.84762: _low_level_execute_command(): starting 27844 1726882748.84766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882748.5475025-28267-151124388570384/ > /dev/null 2>&1 && sleep 0' 27844 1726882748.85561: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882748.85572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.85585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.85603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.85657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.85671: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882748.85701: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.85719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882748.85729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882748.85781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882748.85808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882748.85822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882748.85882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882748.85891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882748.85897: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882748.85906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882748.85981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882748.86025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882748.86037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882748.86159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882748.87970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882748.88022: stderr chunk (state=3): >>><<< 27844 1726882748.88029: stdout chunk (state=3): >>><<< 27844 1726882748.88090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882748.88094: handler run complete 27844 1726882748.88097: Evaluated conditional (False): False 27844 1726882748.88099: attempt loop complete, returning result 27844 1726882748.88101: _execute() done 27844 1726882748.88103: dumping result to json 27844 1726882748.88105: done dumping result, returning 27844 1726882748.88107: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-efa9-466a-00000000016f] 27844 1726882748.88109: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016f 27844 1726882748.88190: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000016f 27844 1726882748.88194: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.020357", "end": "2024-09-20 21:39:08.831562", "rc": 0, "start": "2024-09-20 21:39:08.811205" } 27844 1726882748.88307: no more pending results, returning what we have 27844 1726882748.88311: results queue empty 27844 1726882748.88312: checking for any_errors_fatal 27844 
1726882748.88321: done checking for any_errors_fatal 27844 1726882748.88322: checking for max_fail_percentage 27844 1726882748.88323: done checking for max_fail_percentage 27844 1726882748.88324: checking to see if all hosts have failed and the running result is not ok 27844 1726882748.88325: done checking to see if all hosts have failed 27844 1726882748.88325: getting the remaining hosts for this loop 27844 1726882748.88327: done getting the remaining hosts for this loop 27844 1726882748.88330: getting the next task for host managed_node1 27844 1726882748.88334: done getting next task for host managed_node1 27844 1726882748.88336: ^ task is: TASK: Delete veth interface {{ interface }} 27844 1726882748.88339: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882748.88342: getting variables 27844 1726882748.88343: in VariableManager get_vars() 27844 1726882748.88408: Calling all_inventory to load vars for managed_node1 27844 1726882748.88411: Calling groups_inventory to load vars for managed_node1 27844 1726882748.88417: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882748.88430: Calling all_plugins_play to load vars for managed_node1 27844 1726882748.88433: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882748.88437: Calling groups_plugins_play to load vars for managed_node1 27844 1726882748.88553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882748.88693: done with get_vars() 27844 1726882748.88701: done getting variables 27844 1726882748.88745: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882748.88875: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:39:08 -0400 (0:00:00.393) 0:00:07.965 ****** 27844 1726882748.88897: entering _queue_task() for managed_node1/command 27844 1726882748.89097: worker is 1 (out of 1 available) 27844 1726882748.89109: exiting _queue_task() for managed_node1/command 27844 1726882748.89122: done queuing things up, now waiting for results queue to drain 27844 1726882748.89123: waiting for pending results... 
27844 1726882748.89298: running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 27844 1726882748.89388: in run() - task 0e448fcc-3ce9-efa9-466a-000000000170 27844 1726882748.89398: variable 'ansible_search_path' from source: unknown 27844 1726882748.89402: variable 'ansible_search_path' from source: unknown 27844 1726882748.89438: calling self._execute() 27844 1726882748.89543: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.89587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.89601: variable 'omit' from source: magic vars 27844 1726882748.90685: variable 'ansible_distribution_major_version' from source: facts 27844 1726882748.90702: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882748.90909: variable 'type' from source: set_fact 27844 1726882748.90919: variable 'state' from source: include params 27844 1726882748.90928: variable 'interface' from source: set_fact 27844 1726882748.90936: variable 'current_interfaces' from source: set_fact 27844 1726882748.90950: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 27844 1726882748.90959: when evaluation is False, skipping this task 27844 1726882748.90966: _execute() done 27844 1726882748.90977: dumping result to json 27844 1726882748.90995: done dumping result, returning 27844 1726882748.91006: done running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest0 [0e448fcc-3ce9-efa9-466a-000000000170] 27844 1726882748.91015: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000170 27844 1726882748.91117: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000170 27844 1726882748.91128: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 27844 1726882748.91391: no more pending results, returning what we have 27844 1726882748.91394: results queue empty 27844 1726882748.91397: checking for any_errors_fatal 27844 1726882748.91404: done checking for any_errors_fatal 27844 1726882748.91407: checking for max_fail_percentage 27844 1726882748.91408: done checking for max_fail_percentage 27844 1726882748.91409: checking to see if all hosts have failed and the running result is not ok 27844 1726882748.91410: done checking to see if all hosts have failed 27844 1726882748.91411: getting the remaining hosts for this loop 27844 1726882748.91415: done getting the remaining hosts for this loop 27844 1726882748.91418: getting the next task for host managed_node1 27844 1726882748.91423: done getting next task for host managed_node1 27844 1726882748.91427: ^ task is: TASK: Create dummy interface {{ interface }} 27844 1726882748.91430: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882748.91434: getting variables 27844 1726882748.91436: in VariableManager get_vars() 27844 1726882748.91483: Calling all_inventory to load vars for managed_node1 27844 1726882748.91485: Calling groups_inventory to load vars for managed_node1 27844 1726882748.91488: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882748.91607: Calling all_plugins_play to load vars for managed_node1 27844 1726882748.91611: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882748.91615: Calling groups_plugins_play to load vars for managed_node1 27844 1726882748.91847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882748.92046: done with get_vars() 27844 1726882748.92059: done getting variables 27844 1726882748.92119: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882748.92222: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:39:08 -0400 (0:00:00.033) 0:00:07.999 ****** 27844 1726882748.92250: entering _queue_task() for managed_node1/command 27844 1726882748.92901: worker is 1 (out of 1 available) 27844 1726882748.92920: exiting _queue_task() for managed_node1/command 27844 1726882748.92932: done queuing things up, now waiting for results queue to drain 27844 1726882748.92934: waiting for pending results... 
27844 1726882748.93618: running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 27844 1726882748.93723: in run() - task 0e448fcc-3ce9-efa9-466a-000000000171 27844 1726882748.93741: variable 'ansible_search_path' from source: unknown 27844 1726882748.93748: variable 'ansible_search_path' from source: unknown 27844 1726882748.93797: calling self._execute() 27844 1726882748.93882: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.93895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.93907: variable 'omit' from source: magic vars 27844 1726882748.94256: variable 'ansible_distribution_major_version' from source: facts 27844 1726882748.94278: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882748.94489: variable 'type' from source: set_fact 27844 1726882748.94498: variable 'state' from source: include params 27844 1726882748.94506: variable 'interface' from source: set_fact 27844 1726882748.94514: variable 'current_interfaces' from source: set_fact 27844 1726882748.94524: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 27844 1726882748.94535: when evaluation is False, skipping this task 27844 1726882748.94541: _execute() done 27844 1726882748.94547: dumping result to json 27844 1726882748.94554: done dumping result, returning 27844 1726882748.94562: done running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest0 [0e448fcc-3ce9-efa9-466a-000000000171] 27844 1726882748.94576: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000171 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27844 1726882748.94712: no more pending results, returning what we have 27844 1726882748.94716: results queue empty 27844 
1726882748.94717: checking for any_errors_fatal 27844 1726882748.94723: done checking for any_errors_fatal 27844 1726882748.94724: checking for max_fail_percentage 27844 1726882748.94725: done checking for max_fail_percentage 27844 1726882748.94726: checking to see if all hosts have failed and the running result is not ok 27844 1726882748.94727: done checking to see if all hosts have failed 27844 1726882748.94728: getting the remaining hosts for this loop 27844 1726882748.94729: done getting the remaining hosts for this loop 27844 1726882748.94732: getting the next task for host managed_node1 27844 1726882748.94739: done getting next task for host managed_node1 27844 1726882748.94741: ^ task is: TASK: Delete dummy interface {{ interface }} 27844 1726882748.94744: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882748.94747: getting variables 27844 1726882748.94749: in VariableManager get_vars() 27844 1726882748.94793: Calling all_inventory to load vars for managed_node1 27844 1726882748.94796: Calling groups_inventory to load vars for managed_node1 27844 1726882748.94799: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882748.94812: Calling all_plugins_play to load vars for managed_node1 27844 1726882748.94814: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882748.94817: Calling groups_plugins_play to load vars for managed_node1 27844 1726882748.95005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882748.95223: done with get_vars() 27844 1726882748.95234: done getting variables 27844 1726882748.95323: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882748.95550: variable 'interface' from source: set_fact 27844 1726882748.95697: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000171 27844 1726882748.95701: WORKER PROCESS EXITING TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:39:08 -0400 (0:00:00.034) 0:00:08.033 ****** 27844 1726882748.95713: entering _queue_task() for managed_node1/command 27844 1726882748.95935: worker is 1 (out of 1 available) 27844 1726882748.95947: exiting _queue_task() for managed_node1/command 27844 1726882748.95961: done queuing things up, now waiting for results queue to drain 27844 1726882748.95969: waiting for pending results... 
27844 1726882748.96384: running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 27844 1726882748.96488: in run() - task 0e448fcc-3ce9-efa9-466a-000000000172 27844 1726882748.96536: variable 'ansible_search_path' from source: unknown 27844 1726882748.96543: variable 'ansible_search_path' from source: unknown 27844 1726882748.96585: calling self._execute() 27844 1726882748.96783: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882748.96878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882748.96893: variable 'omit' from source: magic vars 27844 1726882748.97606: variable 'ansible_distribution_major_version' from source: facts 27844 1726882748.97624: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882748.98312: variable 'type' from source: set_fact 27844 1726882748.98348: variable 'state' from source: include params 27844 1726882748.98361: variable 'interface' from source: set_fact 27844 1726882748.98376: variable 'current_interfaces' from source: set_fact 27844 1726882748.98389: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 27844 1726882748.98396: when evaluation is False, skipping this task 27844 1726882748.98404: _execute() done 27844 1726882748.98411: dumping result to json 27844 1726882748.98419: done dumping result, returning 27844 1726882748.98428: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest0 [0e448fcc-3ce9-efa9-466a-000000000172] 27844 1726882748.98439: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000172 27844 1726882748.98557: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000172 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27844 1726882748.98612: no more pending 
results, returning what we have 27844 1726882748.98617: results queue empty 27844 1726882748.98618: checking for any_errors_fatal 27844 1726882748.98625: done checking for any_errors_fatal 27844 1726882748.98625: checking for max_fail_percentage 27844 1726882748.98627: done checking for max_fail_percentage 27844 1726882748.98628: checking to see if all hosts have failed and the running result is not ok 27844 1726882748.98629: done checking to see if all hosts have failed 27844 1726882748.98629: getting the remaining hosts for this loop 27844 1726882748.98631: done getting the remaining hosts for this loop 27844 1726882748.98635: getting the next task for host managed_node1 27844 1726882748.98642: done getting next task for host managed_node1 27844 1726882748.98644: ^ task is: TASK: Create tap interface {{ interface }} 27844 1726882748.98647: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882748.98652: getting variables 27844 1726882748.98654: in VariableManager get_vars() 27844 1726882748.98701: Calling all_inventory to load vars for managed_node1 27844 1726882748.98704: Calling groups_inventory to load vars for managed_node1 27844 1726882748.98706: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882748.98720: Calling all_plugins_play to load vars for managed_node1 27844 1726882748.98723: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882748.98727: Calling groups_plugins_play to load vars for managed_node1 27844 1726882748.98978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882748.99223: done with get_vars() 27844 1726882748.99236: done getting variables 27844 1726882748.99491: WORKER PROCESS EXITING 27844 1726882748.99556: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882748.99775: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:39:08 -0400 (0:00:00.041) 0:00:08.074 ****** 27844 1726882748.99824: entering _queue_task() for managed_node1/command 27844 1726882749.00092: worker is 1 (out of 1 available) 27844 1726882749.00104: exiting _queue_task() for managed_node1/command 27844 1726882749.00115: done queuing things up, now waiting for results queue to drain 27844 1726882749.00117: waiting for pending results... 
27844 1726882749.00358: running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0
27844 1726882749.00470: in run() - task 0e448fcc-3ce9-efa9-466a-000000000173
27844 1726882749.00488: variable 'ansible_search_path' from source: unknown
27844 1726882749.00495: variable 'ansible_search_path' from source: unknown
27844 1726882749.00532: calling self._execute()
27844 1726882749.00618: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882749.00628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882749.00640: variable 'omit' from source: magic vars
27844 1726882749.00988: variable 'ansible_distribution_major_version' from source: facts
27844 1726882749.01005: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882749.01207: variable 'type' from source: set_fact
27844 1726882749.01218: variable 'state' from source: include params
27844 1726882749.01226: variable 'interface' from source: set_fact
27844 1726882749.01234: variable 'current_interfaces' from source: set_fact
27844 1726882749.01244: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False
27844 1726882749.01251: when evaluation is False, skipping this task
27844 1726882749.01256: _execute() done
27844 1726882749.01267: dumping result to json
27844 1726882749.01275: done dumping result, returning
27844 1726882749.01284: done running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest0 [0e448fcc-3ce9-efa9-466a-000000000173]
27844 1726882749.01294: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000173
27844 1726882749.01389: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000173
27844 1726882749.01396: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
27844 1726882749.01476: no more pending results, returning what we have
27844 1726882749.01480: results queue empty
27844 1726882749.01481: checking for any_errors_fatal
27844 1726882749.01488: done checking for any_errors_fatal
27844 1726882749.01488: checking for max_fail_percentage
27844 1726882749.01490: done checking for max_fail_percentage
27844 1726882749.01491: checking to see if all hosts have failed and the running result is not ok
27844 1726882749.01492: done checking to see if all hosts have failed
27844 1726882749.01493: getting the remaining hosts for this loop
27844 1726882749.01494: done getting the remaining hosts for this loop
27844 1726882749.01497: getting the next task for host managed_node1
27844 1726882749.01502: done getting next task for host managed_node1
27844 1726882749.01505: ^ task is: TASK: Delete tap interface {{ interface }}
27844 1726882749.01508: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882749.01512: getting variables
27844 1726882749.01514: in VariableManager get_vars()
27844 1726882749.01553: Calling all_inventory to load vars for managed_node1
27844 1726882749.01555: Calling groups_inventory to load vars for managed_node1
27844 1726882749.01557: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882749.01574: Calling all_plugins_play to load vars for managed_node1
27844 1726882749.01577: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882749.01580: Calling groups_plugins_play to load vars for managed_node1
27844 1726882749.01757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882749.01968: done with get_vars()
27844 1726882749.01979: done getting variables
27844 1726882749.02051: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
27844 1726882749.02155: variable 'interface' from source: set_fact

TASK [Delete tap interface ethtest0] *******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Friday 20 September 2024 21:39:09 -0400 (0:00:00.023) 0:00:08.098 ******
27844 1726882749.02193: entering _queue_task() for managed_node1/command
27844 1726882749.02542: worker is 1 (out of 1 available)
27844 1726882749.02553: exiting _queue_task() for managed_node1/command
27844 1726882749.02566: done queuing things up, now waiting for results queue to drain
27844 1726882749.02567: waiting for pending results...
27844 1726882749.03434: running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0
27844 1726882749.03604: in run() - task 0e448fcc-3ce9-efa9-466a-000000000174
27844 1726882749.03647: variable 'ansible_search_path' from source: unknown
27844 1726882749.03655: variable 'ansible_search_path' from source: unknown
27844 1726882749.03799: calling self._execute()
27844 1726882749.03888: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882749.03906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882749.03919: variable 'omit' from source: magic vars
27844 1726882749.04643: variable 'ansible_distribution_major_version' from source: facts
27844 1726882749.04677: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882749.05120: variable 'type' from source: set_fact
27844 1726882749.05192: variable 'state' from source: include params
27844 1726882749.05207: variable 'interface' from source: set_fact
27844 1726882749.05219: variable 'current_interfaces' from source: set_fact
27844 1726882749.05231: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
27844 1726882749.05238: when evaluation is False, skipping this task
27844 1726882749.05245: _execute() done
27844 1726882749.05253: dumping result to json
27844 1726882749.05306: done dumping result, returning
27844 1726882749.05331: done running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest0 [0e448fcc-3ce9-efa9-466a-000000000174]
27844 1726882749.05341: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000174
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
27844 1726882749.05536: no more pending results, returning what we have
27844 1726882749.05540: results queue empty
27844 1726882749.05541: checking for any_errors_fatal
27844 1726882749.05548: done checking for any_errors_fatal
27844 1726882749.05549: checking for max_fail_percentage
27844 1726882749.05551: done checking for max_fail_percentage
27844 1726882749.05551: checking to see if all hosts have failed and the running result is not ok
27844 1726882749.05552: done checking to see if all hosts have failed
27844 1726882749.05553: getting the remaining hosts for this loop
27844 1726882749.05555: done getting the remaining hosts for this loop
27844 1726882749.05558: getting the next task for host managed_node1
27844 1726882749.05571: done getting next task for host managed_node1
27844 1726882749.05575: ^ task is: TASK: Assert device is present
27844 1726882749.05577: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882749.05581: getting variables
27844 1726882749.05583: in VariableManager get_vars()
27844 1726882749.05622: Calling all_inventory to load vars for managed_node1
27844 1726882749.05625: Calling groups_inventory to load vars for managed_node1
27844 1726882749.05627: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882749.05642: Calling all_plugins_play to load vars for managed_node1
27844 1726882749.05645: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882749.05649: Calling groups_plugins_play to load vars for managed_node1
27844 1726882749.05862: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000174
27844 1726882749.05870: WORKER PROCESS EXITING
27844 1726882749.05884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882749.06105: done with get_vars()
27844 1726882749.06116: done getting variables

TASK [Assert device is present] ************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:21
Friday 20 September 2024 21:39:09 -0400 (0:00:00.050) 0:00:08.149 ******
27844 1726882749.07283: entering _queue_task() for managed_node1/include_tasks
27844 1726882749.07505: worker is 1 (out of 1 available)
27844 1726882749.07518: exiting _queue_task() for managed_node1/include_tasks
27844 1726882749.07530: done queuing things up, now waiting for results queue to drain
27844 1726882749.07532: waiting for pending results...
27844 1726882749.08203: running TaskExecutor() for managed_node1/TASK: Assert device is present
27844 1726882749.08504: in run() - task 0e448fcc-3ce9-efa9-466a-00000000000e
27844 1726882749.08517: variable 'ansible_search_path' from source: unknown
27844 1726882749.08554: calling self._execute()
27844 1726882749.08749: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882749.08755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882749.08763: variable 'omit' from source: magic vars
27844 1726882749.09749: variable 'ansible_distribution_major_version' from source: facts
27844 1726882749.09773: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882749.09806: _execute() done
27844 1726882749.09914: dumping result to json
27844 1726882749.09923: done dumping result, returning
27844 1726882749.09934: done running TaskExecutor() for managed_node1/TASK: Assert device is present [0e448fcc-3ce9-efa9-466a-00000000000e]
27844 1726882749.09943: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000e
27844 1726882749.10066: no more pending results, returning what we have
27844 1726882749.10072: in VariableManager get_vars()
27844 1726882749.10119: Calling all_inventory to load vars for managed_node1
27844 1726882749.10122: Calling groups_inventory to load vars for managed_node1
27844 1726882749.10124: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882749.10139: Calling all_plugins_play to load vars for managed_node1
27844 1726882749.10142: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882749.10145: Calling groups_plugins_play to load vars for managed_node1
27844 1726882749.10339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882749.10535: done with get_vars()
27844 1726882749.10543: variable 'ansible_search_path' from source: unknown
27844 1726882749.10557: we have included files to process
27844 1726882749.10558: generating all_blocks data
27844 1726882749.10560: done generating all_blocks data
27844 1726882749.10570: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
27844 1726882749.10572: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
27844 1726882749.10575: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
27844 1726882749.10946: in VariableManager get_vars()
27844 1726882749.10972: done with get_vars()
27844 1726882749.11198: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000e
27844 1726882749.11202: WORKER PROCESS EXITING
27844 1726882749.11310: done processing included file
27844 1726882749.11312: iterating over new_blocks loaded from include file
27844 1726882749.11313: in VariableManager get_vars()
27844 1726882749.11329: done with get_vars()
27844 1726882749.11331: filtering new block on tags
27844 1726882749.11348: done filtering new block on tags
27844 1726882749.11351: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1
27844 1726882749.11355: extending task lists for all hosts with included blocks
27844 1726882749.12048: done extending task lists
27844 1726882749.12049: done processing included files
27844 1726882749.12050: results queue empty
27844 1726882749.12051: checking for any_errors_fatal
27844 1726882749.12054: done checking for any_errors_fatal
27844 1726882749.12055: checking for max_fail_percentage
27844 1726882749.12056: done checking for max_fail_percentage
27844 1726882749.12056: checking to see if all hosts have failed and the running result is not ok
27844 1726882749.12057: done checking to see if all hosts have failed
27844 1726882749.12058: getting the remaining hosts for this loop
27844 1726882749.12059: done getting the remaining hosts for this loop
27844 1726882749.12062: getting the next task for host managed_node1
27844 1726882749.12173: done getting next task for host managed_node1
27844 1726882749.12176: ^ task is: TASK: Include the task 'get_interface_stat.yml'
27844 1726882749.12179: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882749.12181: getting variables
27844 1726882749.12182: in VariableManager get_vars()
27844 1726882749.12194: Calling all_inventory to load vars for managed_node1
27844 1726882749.12196: Calling groups_inventory to load vars for managed_node1
27844 1726882749.12198: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882749.12203: Calling all_plugins_play to load vars for managed_node1
27844 1726882749.12205: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882749.12208: Calling groups_plugins_play to load vars for managed_node1
27844 1726882749.12340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882749.12527: done with get_vars()
27844 1726882749.12536: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 21:39:09 -0400 (0:00:00.053) 0:00:08.202 ******
27844 1726882749.12604: entering _queue_task() for managed_node1/include_tasks
27844 1726882749.12834: worker is 1 (out of 1 available)
27844 1726882749.12845: exiting _queue_task() for managed_node1/include_tasks
27844 1726882749.12855: done queuing things up, now waiting for results queue to drain
27844 1726882749.12857: waiting for pending results...
27844 1726882749.13104: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml'
27844 1726882749.13204: in run() - task 0e448fcc-3ce9-efa9-466a-000000000214
27844 1726882749.13221: variable 'ansible_search_path' from source: unknown
27844 1726882749.13229: variable 'ansible_search_path' from source: unknown
27844 1726882749.13269: calling self._execute()
27844 1726882749.13352: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882749.13363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882749.13380: variable 'omit' from source: magic vars
27844 1726882749.13735: variable 'ansible_distribution_major_version' from source: facts
27844 1726882749.13753: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882749.13766: _execute() done
27844 1726882749.13775: dumping result to json
27844 1726882749.13782: done dumping result, returning
27844 1726882749.13791: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-efa9-466a-000000000214]
27844 1726882749.13800: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000214
27844 1726882749.13912: no more pending results, returning what we have
27844 1726882749.13918: in VariableManager get_vars()
27844 1726882749.13969: Calling all_inventory to load vars for managed_node1
27844 1726882749.13972: Calling groups_inventory to load vars for managed_node1
27844 1726882749.13975: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882749.13988: Calling all_plugins_play to load vars for managed_node1
27844 1726882749.13991: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882749.13994: Calling groups_plugins_play to load vars for managed_node1
27844 1726882749.14200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882749.14390: done with get_vars()
27844 1726882749.14398: variable 'ansible_search_path' from source: unknown
27844 1726882749.14399: variable 'ansible_search_path' from source: unknown
27844 1726882749.14433: we have included files to process
27844 1726882749.14434: generating all_blocks data
27844 1726882749.14436: done generating all_blocks data
27844 1726882749.14437: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
27844 1726882749.14438: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
27844 1726882749.14441: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
27844 1726882749.14693: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000214
27844 1726882749.14697: WORKER PROCESS EXITING
27844 1726882749.14800: done processing included file
27844 1726882749.14802: iterating over new_blocks loaded from include file
27844 1726882749.14804: in VariableManager get_vars()
27844 1726882749.14820: done with get_vars()
27844 1726882749.14822: filtering new block on tags
27844 1726882749.14838: done filtering new block on tags
27844 1726882749.14840: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1
27844 1726882749.14845: extending task lists for all hosts with included blocks
27844 1726882749.15024: done extending task lists
27844 1726882749.15025: done processing included files
27844 1726882749.15026: results queue empty
27844 1726882749.15027: checking for any_errors_fatal
27844 1726882749.15030: done checking for any_errors_fatal
27844 1726882749.15030: checking for max_fail_percentage
27844 1726882749.15032: done checking for max_fail_percentage
27844 1726882749.15032: checking to see if all hosts have failed and the running result is not ok
27844 1726882749.15033: done checking to see if all hosts have failed
27844 1726882749.15034: getting the remaining hosts for this loop
27844 1726882749.15035: done getting the remaining hosts for this loop
27844 1726882749.15038: getting the next task for host managed_node1
27844 1726882749.15041: done getting next task for host managed_node1
27844 1726882749.15043: ^ task is: TASK: Get stat for interface {{ interface }}
27844 1726882749.15046: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882749.15048: getting variables
27844 1726882749.15049: in VariableManager get_vars()
27844 1726882749.15061: Calling all_inventory to load vars for managed_node1
27844 1726882749.15063: Calling groups_inventory to load vars for managed_node1
27844 1726882749.15067: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882749.15073: Calling all_plugins_play to load vars for managed_node1
27844 1726882749.15076: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882749.15079: Calling groups_plugins_play to load vars for managed_node1
27844 1726882749.15213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882749.15403: done with get_vars()
27844 1726882749.15410: done getting variables
27844 1726882749.15552: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest0] *****************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 21:39:09 -0400 (0:00:00.029) 0:00:08.232 ******
27844 1726882749.15583: entering _queue_task() for managed_node1/stat
27844 1726882749.15788: worker is 1 (out of 1 available)
27844 1726882749.15800: exiting _queue_task() for managed_node1/stat
27844 1726882749.15809: done queuing things up, now waiting for results queue to drain
27844 1726882749.15811: waiting for pending results...
27844 1726882749.16041: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0
27844 1726882749.16148: in run() - task 0e448fcc-3ce9-efa9-466a-000000000267
27844 1726882749.16173: variable 'ansible_search_path' from source: unknown
27844 1726882749.16180: variable 'ansible_search_path' from source: unknown
27844 1726882749.16218: calling self._execute()
27844 1726882749.16301: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882749.16311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882749.16325: variable 'omit' from source: magic vars
27844 1726882749.16724: variable 'ansible_distribution_major_version' from source: facts
27844 1726882749.16741: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882749.16751: variable 'omit' from source: magic vars
27844 1726882749.16795: variable 'omit' from source: magic vars
27844 1726882749.16895: variable 'interface' from source: set_fact
27844 1726882749.16929: variable 'omit' from source: magic vars
27844 1726882749.16973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
27844 1726882749.17008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
27844 1726882749.17036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
27844 1726882749.17058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882749.17077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882749.17110: variable 'inventory_hostname' from source: host vars for 'managed_node1'
27844 1726882749.17120: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882749.17131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882749.17230: Set connection var ansible_shell_type to sh
27844 1726882749.17242: Set connection var ansible_connection to ssh
27844 1726882749.17252: Set connection var ansible_pipelining to False
27844 1726882749.17262: Set connection var ansible_timeout to 10
27844 1726882749.17277: Set connection var ansible_shell_executable to /bin/sh
27844 1726882749.17287: Set connection var ansible_module_compression to ZIP_DEFLATED
27844 1726882749.17318: variable 'ansible_shell_executable' from source: unknown
27844 1726882749.17326: variable 'ansible_connection' from source: unknown
27844 1726882749.17332: variable 'ansible_module_compression' from source: unknown
27844 1726882749.17336: variable 'ansible_shell_type' from source: unknown
27844 1726882749.17343: variable 'ansible_shell_executable' from source: unknown
27844 1726882749.17350: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882749.17355: variable 'ansible_pipelining' from source: unknown
27844 1726882749.17359: variable 'ansible_timeout' from source: unknown
27844 1726882749.17366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882749.17534: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
27844 1726882749.17548: variable 'omit' from source: magic vars
27844 1726882749.17560: starting attempt loop
27844 1726882749.17571: running the handler
27844 1726882749.17592: _low_level_execute_command(): starting
27844 1726882749.17605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
27844 1726882749.18343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
27844 1726882749.18359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.18379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882749.18399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882749.18446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882749.18459: stderr chunk (state=3): >>>debug2: match not found <<<
27844 1726882749.18482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882749.18501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
27844 1726882749.18514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
27844 1726882749.18526: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
27844 1726882749.18542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.18556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882749.18579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882749.18593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882749.18605: stderr chunk (state=3): >>>debug2: match found <<<
27844 1726882749.18620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882749.18701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882749.18725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882749.18743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882749.18884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882749.20565: stdout chunk (state=3): >>>/root <<<
27844 1726882749.20754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882749.20758: stdout chunk (state=3): >>><<<
27844 1726882749.20760: stderr chunk (state=3): >>><<<
27844 1726882749.20872: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
27844 1726882749.20876: _low_level_execute_command(): starting
27844 1726882749.20879: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934 `" && echo ansible-tmp-1726882749.2078345-28309-263179822363934="` echo /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934 `" ) && sleep 0'
27844 1726882749.21432: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
27844 1726882749.21445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.21459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882749.21482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882749.21525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882749.21538: stderr chunk (state=3): >>>debug2: match not found <<<
27844 1726882749.21552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882749.21573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
27844 1726882749.21586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
27844 1726882749.21598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
27844 1726882749.21611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.21625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882749.21646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882749.21659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882749.21676: stderr chunk (state=3): >>>debug2: match found <<<
27844 1726882749.21690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882749.21771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882749.21792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882749.21809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882749.21937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882749.23847: stdout chunk (state=3): >>>ansible-tmp-1726882749.2078345-28309-263179822363934=/root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934 <<<
27844 1726882749.24034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882749.24037: stdout chunk (state=3): >>><<<
27844 1726882749.24040: stderr chunk (state=3): >>><<<
27844 1726882749.24373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882749.2078345-28309-263179822363934=/root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
27844 1726882749.24377: variable 'ansible_module_compression' from source: unknown
27844 1726882749.24380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
27844 1726882749.24382: variable 'ansible_facts' from source: unknown
27844 1726882749.24384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934/AnsiballZ_stat.py
27844 1726882749.24686: Sending initial data
27844 1726882749.24690: Sent initial data (153 bytes)
27844 1726882749.27079: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
27844 1726882749.27214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.27231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882749.27250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882749.27294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882749.27310: stderr chunk (state=3): >>>debug2: match not found <<<
27844 1726882749.27327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882749.27343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
27844 1726882749.27356: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
27844 1726882749.27368: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
27844 1726882749.27380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.27392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882749.27406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882749.27420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882749.27432: stderr chunk (state=3): >>>debug2: match found <<<
27844 1726882749.27446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882749.27526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882749.27579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882749.27597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882749.27730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882749.29494: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
27844 1726882749.29590: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
27844 1726882749.29689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpe5ib8ni_ /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934/AnsiballZ_stat.py <<<
27844 1726882749.29781: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
27844 1726882749.31278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882749.31282: stderr chunk (state=3): >>><<<
27844 1726882749.31286: stdout chunk (state=3): >>><<<
27844 1726882749.31311: done transferring module to remote
27844 1726882749.31324: _low_level_execute_command(): starting
27844 1726882749.31330: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934/ /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934/AnsiballZ_stat.py && sleep 0'
27844 1726882749.32752: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.32755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882749.32801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<<
27844 1726882749.32813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882749.32817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<<
27844 1726882749.32819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882749.32977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882749.33158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882749.33162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882749.33259: stderr chunk (state=3): >>>debug1:
mux_client_request_session: master session id: 2 <<< 27844 1726882749.35030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882749.35099: stderr chunk (state=3): >>><<< 27844 1726882749.35102: stdout chunk (state=3): >>><<< 27844 1726882749.35196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882749.35199: _low_level_execute_command(): starting 27844 1726882749.35202: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934/AnsiballZ_stat.py && sleep 0' 27844 1726882749.36628: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882749.36729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882749.36743: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.36760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882749.36803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882749.36814: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882749.36835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.36852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882749.36866: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882749.36878: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882749.36889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882749.36902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.36916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882749.36929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882749.36944: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882749.36957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.37033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882749.37170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882749.37186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882749.37322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882749.50467: stdout chunk (state=3): >>> {"changed": false, 
"stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28490, "dev": 21, "nlink": 1, "atime": 1726882747.6511722, "mtime": 1726882747.6511722, "ctime": 1726882747.6511722, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27844 1726882749.51431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882749.51450: stderr chunk (state=3): >>><<< 27844 1726882749.51453: stdout chunk (state=3): >>><<< 27844 1726882749.51473: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28490, "dev": 21, "nlink": 1, "atime": 1726882747.6511722, "mtime": 1726882747.6511722, "ctime": 1726882747.6511722, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882749.51513: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882749.51521: _low_level_execute_command(): starting 27844 1726882749.51527: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882749.2078345-28309-263179822363934/ > /dev/null 2>&1 && sleep 0' 27844 1726882749.51954: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882749.51962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882749.51983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.51986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882749.52028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.52032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.52034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.52077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882749.52089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882749.52193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882749.54004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882749.54067: stderr chunk (state=3): >>><<< 27844 1726882749.54070: stdout chunk (state=3): >>><<< 27844 1726882749.54083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882749.54139: handler run complete 27844 1726882749.54155: attempt loop complete, returning result 27844 1726882749.54158: _execute() done 27844 1726882749.54161: dumping result to json 27844 1726882749.54228: done dumping result, returning 27844 1726882749.54231: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [0e448fcc-3ce9-efa9-466a-000000000267] 27844 1726882749.54234: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000267 27844 1726882749.54368: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000267 27844 1726882749.54371: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882747.6511722, "block_size": 4096, "blocks": 0, "ctime": 1726882747.6511722, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28490, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726882747.6511722, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 27844 1726882749.54635: no more pending results, returning what we have 27844 
1726882749.54640: results queue empty 27844 1726882749.54641: checking for any_errors_fatal 27844 1726882749.54642: done checking for any_errors_fatal 27844 1726882749.54643: checking for max_fail_percentage 27844 1726882749.54644: done checking for max_fail_percentage 27844 1726882749.54645: checking to see if all hosts have failed and the running result is not ok 27844 1726882749.54646: done checking to see if all hosts have failed 27844 1726882749.54647: getting the remaining hosts for this loop 27844 1726882749.54648: done getting the remaining hosts for this loop 27844 1726882749.54651: getting the next task for host managed_node1 27844 1726882749.54659: done getting next task for host managed_node1 27844 1726882749.54662: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 27844 1726882749.54672: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882749.54677: getting variables 27844 1726882749.54678: in VariableManager get_vars() 27844 1726882749.54720: Calling all_inventory to load vars for managed_node1 27844 1726882749.54723: Calling groups_inventory to load vars for managed_node1 27844 1726882749.54725: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882749.54741: Calling all_plugins_play to load vars for managed_node1 27844 1726882749.54745: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882749.54749: Calling groups_plugins_play to load vars for managed_node1 27844 1726882749.54945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882749.55149: done with get_vars() 27844 1726882749.55159: done getting variables 27844 1726882749.55282: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 27844 1726882749.55392: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:39:09 -0400 (0:00:00.400) 0:00:08.632 ****** 27844 1726882749.55601: entering _queue_task() for managed_node1/assert 27844 1726882749.55604: Creating lock for assert 27844 1726882749.55933: worker is 1 (out of 1 available) 27844 1726882749.55950: exiting _queue_task() for managed_node1/assert 27844 1726882749.55961: done queuing things up, now waiting for results queue to drain 27844 1726882749.55962: waiting for pending results... 
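The `Get stat for interface ethtest0` task above came back with `exists: true` and `islnk: true` for `/sys/class/net/ethtest0` (entries under `/sys/class/net` are symlinks into `/sys/devices`). Judging from the `_execute_module` arguments recorded in the log, the task probably looks roughly like the following — the `register` name is inferred from the later `interface_stat` references, so treat this as a reconstruction, not the actual test file:

```yaml
- name: "Get stat for interface {{ interface }}"
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false   # matches the module_args shown in the log
    get_checksum: false
    get_mime: false
  register: interface_stat  # inferred: a later assert reads interface_stat.stat.exists
```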
27844 1726882749.56239: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' 27844 1726882749.56361: in run() - task 0e448fcc-3ce9-efa9-466a-000000000215 27844 1726882749.56390: variable 'ansible_search_path' from source: unknown 27844 1726882749.56399: variable 'ansible_search_path' from source: unknown 27844 1726882749.56454: calling self._execute() 27844 1726882749.56562: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.56578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.56592: variable 'omit' from source: magic vars 27844 1726882749.57023: variable 'ansible_distribution_major_version' from source: facts 27844 1726882749.57049: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882749.57070: variable 'omit' from source: magic vars 27844 1726882749.57115: variable 'omit' from source: magic vars 27844 1726882749.57247: variable 'interface' from source: set_fact 27844 1726882749.57288: variable 'omit' from source: magic vars 27844 1726882749.57345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882749.57404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882749.57433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882749.57462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882749.57489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882749.57537: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882749.57554: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.57566: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.57691: Set connection var ansible_shell_type to sh 27844 1726882749.57705: Set connection var ansible_connection to ssh 27844 1726882749.57720: Set connection var ansible_pipelining to False 27844 1726882749.57737: Set connection var ansible_timeout to 10 27844 1726882749.57750: Set connection var ansible_shell_executable to /bin/sh 27844 1726882749.57760: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882749.57792: variable 'ansible_shell_executable' from source: unknown 27844 1726882749.57801: variable 'ansible_connection' from source: unknown 27844 1726882749.57810: variable 'ansible_module_compression' from source: unknown 27844 1726882749.57831: variable 'ansible_shell_type' from source: unknown 27844 1726882749.57847: variable 'ansible_shell_executable' from source: unknown 27844 1726882749.57859: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.57879: variable 'ansible_pipelining' from source: unknown 27844 1726882749.57890: variable 'ansible_timeout' from source: unknown 27844 1726882749.57906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.58108: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882749.58128: variable 'omit' from source: magic vars 27844 1726882749.58148: starting attempt loop 27844 1726882749.58164: running the handler 27844 1726882749.58341: variable 'interface_stat' from source: set_fact 27844 1726882749.58378: Evaluated conditional (interface_stat.stat.exists): True 27844 1726882749.58394: handler run complete 27844 1726882749.58421: attempt loop complete, returning result 27844 
1726882749.58429: _execute() done 27844 1726882749.58444: dumping result to json 27844 1726882749.58453: done dumping result, returning 27844 1726882749.58472: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest0' [0e448fcc-3ce9-efa9-466a-000000000215] 27844 1726882749.58487: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000215 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882749.58652: no more pending results, returning what we have 27844 1726882749.58657: results queue empty 27844 1726882749.58658: checking for any_errors_fatal 27844 1726882749.58666: done checking for any_errors_fatal 27844 1726882749.58667: checking for max_fail_percentage 27844 1726882749.58671: done checking for max_fail_percentage 27844 1726882749.58672: checking to see if all hosts have failed and the running result is not ok 27844 1726882749.58675: done checking to see if all hosts have failed 27844 1726882749.58676: getting the remaining hosts for this loop 27844 1726882749.58677: done getting the remaining hosts for this loop 27844 1726882749.58684: getting the next task for host managed_node1 27844 1726882749.58696: done getting next task for host managed_node1 27844 1726882749.58701: ^ task is: TASK: Set interface1 27844 1726882749.58704: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882749.58709: getting variables 27844 1726882749.58712: in VariableManager get_vars() 27844 1726882749.58766: Calling all_inventory to load vars for managed_node1 27844 1726882749.58769: Calling groups_inventory to load vars for managed_node1 27844 1726882749.58772: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882749.58785: Calling all_plugins_play to load vars for managed_node1 27844 1726882749.58788: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882749.58791: Calling groups_plugins_play to load vars for managed_node1 27844 1726882749.58951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882749.59196: done with get_vars() 27844 1726882749.59205: done getting variables 27844 1726882749.59239: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000215 27844 1726882749.59242: WORKER PROCESS EXITING 27844 1726882749.59286: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set interface1] ********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:23 Friday 20 September 2024 21:39:09 -0400 (0:00:00.037) 0:00:08.669 ****** 27844 1726882749.59316: entering _queue_task() for managed_node1/set_fact 27844 1726882749.59554: worker is 1 (out of 1 available) 27844 1726882749.59571: exiting _queue_task() for managed_node1/set_fact 27844 1726882749.59587: done queuing things up, now waiting for results queue to drain 27844 1726882749.59591: waiting for pending results... 
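The assertion that just passed evaluates the conditional `interface_stat.stat.exists` (task path `.../assert_device_present.yml:5`). A plausible reconstruction of that task, assuming the conventional shape — the file itself is not shown in this log:

```yaml
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```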
27844 1726882749.59885: running TaskExecutor() for managed_node1/TASK: Set interface1 27844 1726882749.59984: in run() - task 0e448fcc-3ce9-efa9-466a-00000000000f 27844 1726882749.60004: variable 'ansible_search_path' from source: unknown 27844 1726882749.60057: calling self._execute() 27844 1726882749.60156: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.60172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.60196: variable 'omit' from source: magic vars 27844 1726882749.60610: variable 'ansible_distribution_major_version' from source: facts 27844 1726882749.60631: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882749.60648: variable 'omit' from source: magic vars 27844 1726882749.60688: variable 'omit' from source: magic vars 27844 1726882749.60733: variable 'interface1' from source: play vars 27844 1726882749.60830: variable 'interface1' from source: play vars 27844 1726882749.60857: variable 'omit' from source: magic vars 27844 1726882749.60907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882749.60945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882749.60971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882749.60992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882749.61008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882749.61041: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882749.61049: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.61057: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node1' 27844 1726882749.61175: Set connection var ansible_shell_type to sh 27844 1726882749.61184: Set connection var ansible_connection to ssh 27844 1726882749.61196: Set connection var ansible_pipelining to False 27844 1726882749.61212: Set connection var ansible_timeout to 10 27844 1726882749.61231: Set connection var ansible_shell_executable to /bin/sh 27844 1726882749.61250: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882749.61287: variable 'ansible_shell_executable' from source: unknown 27844 1726882749.61300: variable 'ansible_connection' from source: unknown 27844 1726882749.61307: variable 'ansible_module_compression' from source: unknown 27844 1726882749.61315: variable 'ansible_shell_type' from source: unknown 27844 1726882749.61322: variable 'ansible_shell_executable' from source: unknown 27844 1726882749.61331: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.61344: variable 'ansible_pipelining' from source: unknown 27844 1726882749.61353: variable 'ansible_timeout' from source: unknown 27844 1726882749.61361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.61511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882749.61531: variable 'omit' from source: magic vars 27844 1726882749.61542: starting attempt loop 27844 1726882749.61549: running the handler 27844 1726882749.61569: handler run complete 27844 1726882749.61587: attempt loop complete, returning result 27844 1726882749.61608: _execute() done 27844 1726882749.61620: dumping result to json 27844 1726882749.61635: done dumping result, returning 27844 1726882749.61647: done running TaskExecutor() for 
managed_node1/TASK: Set interface1 [0e448fcc-3ce9-efa9-466a-00000000000f] 27844 1726882749.61656: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000f 27844 1726882749.61776: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000000f ok: [managed_node1] => { "ansible_facts": { "interface": "ethtest1" }, "changed": false } 27844 1726882749.61847: no more pending results, returning what we have 27844 1726882749.61850: results queue empty 27844 1726882749.61852: checking for any_errors_fatal 27844 1726882749.61857: done checking for any_errors_fatal 27844 1726882749.61858: checking for max_fail_percentage 27844 1726882749.61860: done checking for max_fail_percentage 27844 1726882749.61860: checking to see if all hosts have failed and the running result is not ok 27844 1726882749.61861: done checking to see if all hosts have failed 27844 1726882749.61862: getting the remaining hosts for this loop 27844 1726882749.61865: done getting the remaining hosts for this loop 27844 1726882749.61868: getting the next task for host managed_node1 27844 1726882749.61874: done getting next task for host managed_node1 27844 1726882749.61877: ^ task is: TASK: Show interfaces 27844 1726882749.61879: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882749.61881: getting variables 27844 1726882749.61883: in VariableManager get_vars() 27844 1726882749.61920: Calling all_inventory to load vars for managed_node1 27844 1726882749.61924: Calling groups_inventory to load vars for managed_node1 27844 1726882749.61927: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882749.61942: Calling all_plugins_play to load vars for managed_node1 27844 1726882749.61946: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882749.61952: Calling groups_plugins_play to load vars for managed_node1 27844 1726882749.62194: WORKER PROCESS EXITING 27844 1726882749.62223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882749.62433: done with get_vars() 27844 1726882749.62439: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:26 Friday 20 September 2024 21:39:09 -0400 (0:00:00.031) 0:00:08.701 ****** 27844 1726882749.62502: entering _queue_task() for managed_node1/include_tasks 27844 1726882749.62665: worker is 1 (out of 1 available) 27844 1726882749.62679: exiting _queue_task() for managed_node1/include_tasks 27844 1726882749.62690: done queuing things up, now waiting for results queue to drain 27844 1726882749.62692: waiting for pending results... 
27844 1726882749.62845: running TaskExecutor() for managed_node1/TASK: Show interfaces 27844 1726882749.62904: in run() - task 0e448fcc-3ce9-efa9-466a-000000000010 27844 1726882749.62917: variable 'ansible_search_path' from source: unknown 27844 1726882749.62949: calling self._execute() 27844 1726882749.63018: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.63027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.63036: variable 'omit' from source: magic vars 27844 1726882749.63290: variable 'ansible_distribution_major_version' from source: facts 27844 1726882749.63303: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882749.63309: _execute() done 27844 1726882749.63314: dumping result to json 27844 1726882749.63319: done dumping result, returning 27844 1726882749.63325: done running TaskExecutor() for managed_node1/TASK: Show interfaces [0e448fcc-3ce9-efa9-466a-000000000010] 27844 1726882749.63333: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000010 27844 1726882749.63411: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000010 27844 1726882749.63414: WORKER PROCESS EXITING 27844 1726882749.63482: no more pending results, returning what we have 27844 1726882749.63486: in VariableManager get_vars() 27844 1726882749.63521: Calling all_inventory to load vars for managed_node1 27844 1726882749.63523: Calling groups_inventory to load vars for managed_node1 27844 1726882749.63524: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882749.63532: Calling all_plugins_play to load vars for managed_node1 27844 1726882749.63534: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882749.63536: Calling groups_plugins_play to load vars for managed_node1 27844 1726882749.63829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 
1726882749.63937: done with get_vars() 27844 1726882749.63942: variable 'ansible_search_path' from source: unknown 27844 1726882749.63958: we have included files to process 27844 1726882749.63959: generating all_blocks data 27844 1726882749.63961: done generating all_blocks data 27844 1726882749.63978: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882749.63980: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882749.63986: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882749.64052: in VariableManager get_vars() 27844 1726882749.64093: done with get_vars() 27844 1726882749.64192: done processing included file 27844 1726882749.64194: iterating over new_blocks loaded from include file 27844 1726882749.64195: in VariableManager get_vars() 27844 1726882749.64212: done with get_vars() 27844 1726882749.64213: filtering new block on tags 27844 1726882749.64224: done filtering new block on tags 27844 1726882749.64226: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 27844 1726882749.64228: extending task lists for all hosts with included blocks 27844 1726882749.64930: done extending task lists 27844 1726882749.64931: done processing included files 27844 1726882749.64932: results queue empty 27844 1726882749.64933: checking for any_errors_fatal 27844 1726882749.64935: done checking for any_errors_fatal 27844 1726882749.64936: checking for max_fail_percentage 27844 1726882749.64937: done checking for max_fail_percentage 27844 1726882749.64938: checking to see if all hosts have failed and the running result is not ok 27844 
1726882749.64938: done checking to see if all hosts have failed 27844 1726882749.64939: getting the remaining hosts for this loop 27844 1726882749.64940: done getting the remaining hosts for this loop 27844 1726882749.64943: getting the next task for host managed_node1 27844 1726882749.64946: done getting next task for host managed_node1 27844 1726882749.64948: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27844 1726882749.64950: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882749.64952: getting variables 27844 1726882749.64953: in VariableManager get_vars() 27844 1726882749.64969: Calling all_inventory to load vars for managed_node1 27844 1726882749.64971: Calling groups_inventory to load vars for managed_node1 27844 1726882749.64973: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882749.64980: Calling all_plugins_play to load vars for managed_node1 27844 1726882749.64982: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882749.64985: Calling groups_plugins_play to load vars for managed_node1 27844 1726882749.65148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882749.65349: done with get_vars() 27844 1726882749.65360: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:09 -0400 (0:00:00.029) 0:00:08.730 ****** 27844 1726882749.65433: entering _queue_task() for managed_node1/include_tasks 27844 1726882749.65651: worker is 1 (out of 1 available) 27844 1726882749.65663: exiting _queue_task() for managed_node1/include_tasks 27844 1726882749.65678: done queuing things up, now waiting for results queue to drain 27844 1726882749.65679: waiting for pending results... 
27844 1726882749.65910: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 27844 1726882749.65991: in run() - task 0e448fcc-3ce9-efa9-466a-000000000282 27844 1726882749.66012: variable 'ansible_search_path' from source: unknown 27844 1726882749.66024: variable 'ansible_search_path' from source: unknown 27844 1726882749.66073: calling self._execute() 27844 1726882749.66173: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.66188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.66208: variable 'omit' from source: magic vars 27844 1726882749.66643: variable 'ansible_distribution_major_version' from source: facts 27844 1726882749.66659: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882749.66683: _execute() done 27844 1726882749.66699: dumping result to json 27844 1726882749.66706: done dumping result, returning 27844 1726882749.66733: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-efa9-466a-000000000282] 27844 1726882749.66753: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000282 27844 1726882749.66862: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000282 27844 1726882749.66877: WORKER PROCESS EXITING 27844 1726882749.66913: no more pending results, returning what we have 27844 1726882749.66921: in VariableManager get_vars() 27844 1726882749.66968: Calling all_inventory to load vars for managed_node1 27844 1726882749.66972: Calling groups_inventory to load vars for managed_node1 27844 1726882749.66974: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882749.66987: Calling all_plugins_play to load vars for managed_node1 27844 1726882749.66990: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882749.66993: Calling groups_plugins_play to load vars for managed_node1 27844 
1726882749.67217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882749.67440: done with get_vars() 27844 1726882749.67447: variable 'ansible_search_path' from source: unknown 27844 1726882749.67448: variable 'ansible_search_path' from source: unknown 27844 1726882749.67485: we have included files to process 27844 1726882749.67486: generating all_blocks data 27844 1726882749.67487: done generating all_blocks data 27844 1726882749.67489: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882749.67490: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882749.67491: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882749.67756: done processing included file 27844 1726882749.67759: iterating over new_blocks loaded from include file 27844 1726882749.67762: in VariableManager get_vars() 27844 1726882749.67784: done with get_vars() 27844 1726882749.67787: filtering new block on tags 27844 1726882749.67807: done filtering new block on tags 27844 1726882749.67809: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 27844 1726882749.67815: extending task lists for all hosts with included blocks 27844 1726882749.67941: done extending task lists 27844 1726882749.67945: done processing included files 27844 1726882749.67946: results queue empty 27844 1726882749.67947: checking for any_errors_fatal 27844 1726882749.67949: done checking for any_errors_fatal 27844 1726882749.67950: checking for max_fail_percentage 27844 1726882749.67951: done 
checking for max_fail_percentage 27844 1726882749.67952: checking to see if all hosts have failed and the running result is not ok 27844 1726882749.67953: done checking to see if all hosts have failed 27844 1726882749.67953: getting the remaining hosts for this loop 27844 1726882749.67954: done getting the remaining hosts for this loop 27844 1726882749.67957: getting the next task for host managed_node1 27844 1726882749.67961: done getting next task for host managed_node1 27844 1726882749.67963: ^ task is: TASK: Gather current interface info 27844 1726882749.67970: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882749.67975: getting variables 27844 1726882749.67976: in VariableManager get_vars() 27844 1726882749.67995: Calling all_inventory to load vars for managed_node1 27844 1726882749.68000: Calling groups_inventory to load vars for managed_node1 27844 1726882749.68002: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882749.68008: Calling all_plugins_play to load vars for managed_node1 27844 1726882749.68013: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882749.68017: Calling groups_plugins_play to load vars for managed_node1 27844 1726882749.68199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882749.68425: done with get_vars() 27844 1726882749.68436: done getting variables 27844 1726882749.68474: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:09 -0400 (0:00:00.030) 0:00:08.761 ****** 27844 1726882749.68505: entering _queue_task() for managed_node1/command 27844 1726882749.68730: worker is 1 (out of 1 available) 27844 1726882749.68740: exiting _queue_task() for managed_node1/command 27844 1726882749.68752: done queuing things up, now waiting for results queue to drain 27844 1726882749.68753: waiting for pending results... 
27844 1726882749.69059: running TaskExecutor() for managed_node1/TASK: Gather current interface info 27844 1726882749.69205: in run() - task 0e448fcc-3ce9-efa9-466a-0000000002e0 27844 1726882749.69241: variable 'ansible_search_path' from source: unknown 27844 1726882749.69251: variable 'ansible_search_path' from source: unknown 27844 1726882749.69329: calling self._execute() 27844 1726882749.69448: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.69468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.69486: variable 'omit' from source: magic vars 27844 1726882749.69976: variable 'ansible_distribution_major_version' from source: facts 27844 1726882749.69987: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882749.69993: variable 'omit' from source: magic vars 27844 1726882749.70020: variable 'omit' from source: magic vars 27844 1726882749.70050: variable 'omit' from source: magic vars 27844 1726882749.70086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882749.70116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882749.70131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882749.70148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882749.70157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882749.70192: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882749.70209: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.70222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 
1726882749.70360: Set connection var ansible_shell_type to sh 27844 1726882749.70374: Set connection var ansible_connection to ssh 27844 1726882749.70393: Set connection var ansible_pipelining to False 27844 1726882749.70412: Set connection var ansible_timeout to 10 27844 1726882749.70427: Set connection var ansible_shell_executable to /bin/sh 27844 1726882749.70447: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882749.70478: variable 'ansible_shell_executable' from source: unknown 27844 1726882749.70482: variable 'ansible_connection' from source: unknown 27844 1726882749.70485: variable 'ansible_module_compression' from source: unknown 27844 1726882749.70488: variable 'ansible_shell_type' from source: unknown 27844 1726882749.70490: variable 'ansible_shell_executable' from source: unknown 27844 1726882749.70492: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882749.70495: variable 'ansible_pipelining' from source: unknown 27844 1726882749.70497: variable 'ansible_timeout' from source: unknown 27844 1726882749.70499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882749.70612: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882749.70621: variable 'omit' from source: magic vars 27844 1726882749.70626: starting attempt loop 27844 1726882749.70628: running the handler 27844 1726882749.70642: _low_level_execute_command(): starting 27844 1726882749.70649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882749.71292: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 27844 1726882749.71296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882749.71304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882749.71332: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.71335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882749.71337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882749.71340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.71384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882749.71401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882749.71404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882749.71509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882749.73178: stdout chunk (state=3): >>>/root <<< 27844 1726882749.73288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882749.73354: stderr chunk (state=3): >>><<< 27844 1726882749.73362: stdout chunk (state=3): >>><<< 27844 1726882749.73384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882749.73393: _low_level_execute_command(): starting 27844 1726882749.73400: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142 `" && echo ansible-tmp-1726882749.7338128-28334-139672238442142="` echo /root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142 `" ) && sleep 0' 27844 1726882749.73940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882749.73960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882749.73968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.73974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882749.73999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 27844 1726882749.74030: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882749.74033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.74035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882749.74038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882749.74039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.74092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882749.74095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882749.74097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882749.74204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882749.76069: stdout chunk (state=3): >>>ansible-tmp-1726882749.7338128-28334-139672238442142=/root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142 <<< 27844 1726882749.76183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882749.76242: stderr chunk (state=3): >>><<< 27844 1726882749.76246: stdout chunk (state=3): >>><<< 27844 1726882749.76250: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882749.7338128-28334-139672238442142=/root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882749.76282: variable 'ansible_module_compression' from source: unknown 27844 1726882749.76324: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882749.76352: variable 'ansible_facts' from source: unknown 27844 1726882749.76410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142/AnsiballZ_command.py 27844 1726882749.76508: Sending initial data 27844 1726882749.76518: Sent initial data (156 bytes) 27844 1726882749.77355: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882749.77358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.77382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 
1726882749.77427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.77441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.77444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.77511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882749.77515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882749.77602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882749.79347: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 27844 1726882749.79351: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882749.79422: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 
261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882749.79515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpze8x9vay /root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142/AnsiballZ_command.py <<< 27844 1726882749.79603: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882749.80952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882749.81726: stderr chunk (state=3): >>><<< 27844 1726882749.81729: stdout chunk (state=3): >>><<< 27844 1726882749.81744: done transferring module to remote 27844 1726882749.81755: _low_level_execute_command(): starting 27844 1726882749.81758: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142/ /root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142/AnsiballZ_command.py && sleep 0' 27844 1726882749.82375: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882749.82488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882749.82525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882749.82554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882749.82619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882749.84331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882749.84397: stderr chunk (state=3): >>><<< 27844 1726882749.84407: stdout chunk (state=3): >>><<< 27844 1726882749.84469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882749.84472: _low_level_execute_command(): starting 27844 1726882749.84475: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142/AnsiballZ_command.py && sleep 0' 27844 1726882749.84990: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882749.85004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882749.85029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.85032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882749.85067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.85071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882749.85073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882749.85134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882749.85149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882749.85160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882749.85277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882749.98530: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 
21:39:09.980508", "end": "2024-09-20 21:39:09.983767", "delta": "0:00:00.003259", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882749.99686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882749.99780: stderr chunk (state=3): >>><<< 27844 1726882749.99783: stdout chunk (state=3): >>><<< 27844 1726882749.99876: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:09.980508", "end": "2024-09-20 21:39:09.983767", "delta": "0:00:00.003259", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882749.99880: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882749.99883: _low_level_execute_command(): starting 27844 1726882749.99885: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882749.7338128-28334-139672238442142/ > /dev/null 2>&1 && sleep 0' 27844 1726882750.01843: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882750.01966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.01983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.02002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.02044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.02062: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882750.02081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.02099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882750.02117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882750.02129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882750.02142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.02156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.02178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.02191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.02203: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882750.02217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.02347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.02416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.02433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.02560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.04381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.04450: stderr chunk (state=3): >>><<< 27844 1726882750.04453: stdout chunk (state=3): >>><<< 27844 1726882750.04575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.04578: handler run complete 27844 1726882750.04581: Evaluated conditional (False): False 27844 1726882750.04583: attempt loop complete, returning result 27844 1726882750.04585: _execute() done 27844 1726882750.04587: dumping result to json 27844 1726882750.04589: done dumping result, returning 27844 1726882750.04591: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-efa9-466a-0000000002e0] 27844 1726882750.04593: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002e0 27844 1726882750.04761: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002e0 27844 1726882750.04766: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003259", "end": "2024-09-20 21:39:09.983767", "rc": 0, "start": "2024-09-20 21:39:09.980508" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 
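The "Gather current interface info" task above returns the module result as a JSON blob on stdout (`{"changed": true, "stdout": "bonding_masters\neth0\n...", ...}`), and the interface list later used by `set_fact` is just that `stdout` field split on newlines. A minimal illustrative sketch of that parsing step (the function name is hypothetical, not Ansible's internal API):

```python
import json

# Abbreviated module reply, matching the stdout chunk in the log above.
raw_reply = json.dumps({
    "changed": True,
    "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0",
    "rc": 0,
    "cmd": ["ls", "-1"],
})

def stdout_lines(reply_json: str) -> list:
    """Split the command module's 'stdout' field into lines, roughly how
    Ansible derives stdout_lines for later use (illustrative sketch)."""
    reply = json.loads(reply_json)
    return reply["stdout"].splitlines()

interfaces = stdout_lines(raw_reply)
# interfaces == ["bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0"]
```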
27844 1726882750.04871: no more pending results, returning what we have 27844 1726882750.04875: results queue empty 27844 1726882750.04876: checking for any_errors_fatal 27844 1726882750.04878: done checking for any_errors_fatal 27844 1726882750.04878: checking for max_fail_percentage 27844 1726882750.04881: done checking for max_fail_percentage 27844 1726882750.04881: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.04883: done checking to see if all hosts have failed 27844 1726882750.04883: getting the remaining hosts for this loop 27844 1726882750.04885: done getting the remaining hosts for this loop 27844 1726882750.04889: getting the next task for host managed_node1 27844 1726882750.04896: done getting next task for host managed_node1 27844 1726882750.04899: ^ task is: TASK: Set current_interfaces 27844 1726882750.04904: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.04907: getting variables 27844 1726882750.04909: in VariableManager get_vars() 27844 1726882750.04951: Calling all_inventory to load vars for managed_node1 27844 1726882750.04954: Calling groups_inventory to load vars for managed_node1 27844 1726882750.04957: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.04973: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.04976: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.04980: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.05340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.05561: done with get_vars() 27844 1726882750.05574: done getting variables 27844 1726882750.05702: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:10 -0400 (0:00:00.372) 0:00:09.134 ****** 27844 1726882750.05792: entering _queue_task() for managed_node1/set_fact 27844 1726882750.06317: worker is 1 (out of 1 available) 27844 1726882750.06329: exiting _queue_task() for managed_node1/set_fact 27844 1726882750.06342: done queuing things up, now waiting for results queue to drain 27844 1726882750.06344: waiting for pending results... 
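Each `_low_level_execute_command()` call in this log (the `chmod u+x`, the `python3.9 AnsiballZ_command.py` run, and the `rm -f -r` cleanup) executes the remote step under `/bin/sh -c '... && sleep 0'`, exactly as shown in the "executing:" lines. A sketch of that wrapping as it appears in the log; the helper name is hypothetical:

```python
def wrap_remote_command(cmd: str, shell: str = "/bin/sh") -> list:
    """Build the argv shape that _low_level_execute_command logs: the
    command runs under a POSIX shell with a trailing '&& sleep 0',
    reconstructed from the log lines above (illustrative only)."""
    return [shell, "-c", "%s && sleep 0" % cmd]

argv = wrap_remote_command("rm -f -r /root/.ansible/tmp/example/ > /dev/null 2>&1")
```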
27844 1726882750.06933: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 27844 1726882750.07138: in run() - task 0e448fcc-3ce9-efa9-466a-0000000002e1 27844 1726882750.08108: variable 'ansible_search_path' from source: unknown 27844 1726882750.08119: variable 'ansible_search_path' from source: unknown 27844 1726882750.08162: calling self._execute() 27844 1726882750.08441: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.08582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.08598: variable 'omit' from source: magic vars 27844 1726882750.09119: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.09136: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.09145: variable 'omit' from source: magic vars 27844 1726882750.09197: variable 'omit' from source: magic vars 27844 1726882750.09304: variable '_current_interfaces' from source: set_fact 27844 1726882750.09379: variable 'omit' from source: magic vars 27844 1726882750.09429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882750.09467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882750.09491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882750.09514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.09534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.09571: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882750.09580: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.09588: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.09696: Set connection var ansible_shell_type to sh 27844 1726882750.09702: Set connection var ansible_connection to ssh 27844 1726882750.09710: Set connection var ansible_pipelining to False 27844 1726882750.09720: Set connection var ansible_timeout to 10 27844 1726882750.09730: Set connection var ansible_shell_executable to /bin/sh 27844 1726882750.09740: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882750.09776: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.09785: variable 'ansible_connection' from source: unknown 27844 1726882750.09792: variable 'ansible_module_compression' from source: unknown 27844 1726882750.09798: variable 'ansible_shell_type' from source: unknown 27844 1726882750.09804: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.09810: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.09817: variable 'ansible_pipelining' from source: unknown 27844 1726882750.09823: variable 'ansible_timeout' from source: unknown 27844 1726882750.09835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.09987: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882750.10003: variable 'omit' from source: magic vars 27844 1726882750.10013: starting attempt loop 27844 1726882750.10020: running the handler 27844 1726882750.10035: handler run complete 27844 1726882750.10055: attempt loop complete, returning result 27844 1726882750.10061: _execute() done 27844 1726882750.10074: dumping result to json 27844 1726882750.10082: done dumping result, returning 27844 
1726882750.10092: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-efa9-466a-0000000002e1] 27844 1726882750.10099: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002e1 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0" ] }, "changed": false } 27844 1726882750.10245: no more pending results, returning what we have 27844 1726882750.10248: results queue empty 27844 1726882750.10249: checking for any_errors_fatal 27844 1726882750.10258: done checking for any_errors_fatal 27844 1726882750.10259: checking for max_fail_percentage 27844 1726882750.10261: done checking for max_fail_percentage 27844 1726882750.10262: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.10267: done checking to see if all hosts have failed 27844 1726882750.10268: getting the remaining hosts for this loop 27844 1726882750.10270: done getting the remaining hosts for this loop 27844 1726882750.10274: getting the next task for host managed_node1 27844 1726882750.10282: done getting next task for host managed_node1 27844 1726882750.10286: ^ task is: TASK: Show current_interfaces 27844 1726882750.10289: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.10292: getting variables 27844 1726882750.10294: in VariableManager get_vars() 27844 1726882750.10333: Calling all_inventory to load vars for managed_node1 27844 1726882750.10336: Calling groups_inventory to load vars for managed_node1 27844 1726882750.10338: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.10349: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.10352: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.10355: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.10583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.10789: done with get_vars() 27844 1726882750.10802: done getting variables 27844 1726882750.10853: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:10 -0400 (0:00:00.050) 0:00:09.185 ****** 27844 1726882750.10890: entering _queue_task() for managed_node1/debug 27844 1726882750.11015: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002e1 27844 1726882750.11018: WORKER PROCESS EXITING 27844 1726882750.11362: worker is 1 (out of 1 available) 27844 1726882750.11377: exiting _queue_task() for managed_node1/debug 27844 1726882750.11388: done queuing things up, now waiting for results queue to drain 27844 1726882750.11389: waiting for pending results... 
27844 1726882750.11647: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 27844 1726882750.11752: in run() - task 0e448fcc-3ce9-efa9-466a-000000000283 27844 1726882750.11779: variable 'ansible_search_path' from source: unknown 27844 1726882750.11788: variable 'ansible_search_path' from source: unknown 27844 1726882750.11829: calling self._execute() 27844 1726882750.11926: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.11939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.11954: variable 'omit' from source: magic vars 27844 1726882750.13340: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.13375: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.13482: variable 'omit' from source: magic vars 27844 1726882750.13521: variable 'omit' from source: magic vars 27844 1726882750.13742: variable 'current_interfaces' from source: set_fact 27844 1726882750.13917: variable 'omit' from source: magic vars 27844 1726882750.13969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882750.14020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882750.14050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882750.14089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.14149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.14197: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882750.14251: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.14260: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.14579: Set connection var ansible_shell_type to sh 27844 1726882750.14612: Set connection var ansible_connection to ssh 27844 1726882750.14627: Set connection var ansible_pipelining to False 27844 1726882750.14638: Set connection var ansible_timeout to 10 27844 1726882750.14648: Set connection var ansible_shell_executable to /bin/sh 27844 1726882750.14658: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882750.14697: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.14707: variable 'ansible_connection' from source: unknown 27844 1726882750.14715: variable 'ansible_module_compression' from source: unknown 27844 1726882750.14721: variable 'ansible_shell_type' from source: unknown 27844 1726882750.14728: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.14734: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.14742: variable 'ansible_pipelining' from source: unknown 27844 1726882750.14749: variable 'ansible_timeout' from source: unknown 27844 1726882750.14756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.15148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882750.15170: variable 'omit' from source: magic vars 27844 1726882750.15182: starting attempt loop 27844 1726882750.15189: running the handler 27844 1726882750.15244: handler run complete 27844 1726882750.15456: attempt loop complete, returning result 27844 1726882750.15469: _execute() done 27844 1726882750.15478: dumping result to json 27844 1726882750.15485: done dumping result, returning 27844 1726882750.15495: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-efa9-466a-000000000283] 27844 1726882750.15503: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000283 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0'] 27844 1726882750.15636: no more pending results, returning what we have 27844 1726882750.15639: results queue empty 27844 1726882750.15640: checking for any_errors_fatal 27844 1726882750.15644: done checking for any_errors_fatal 27844 1726882750.15645: checking for max_fail_percentage 27844 1726882750.15647: done checking for max_fail_percentage 27844 1726882750.15647: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.15648: done checking to see if all hosts have failed 27844 1726882750.15649: getting the remaining hosts for this loop 27844 1726882750.15651: done getting the remaining hosts for this loop 27844 1726882750.15654: getting the next task for host managed_node1 27844 1726882750.15660: done getting next task for host managed_node1 27844 1726882750.15668: ^ task is: TASK: Manage test interface 27844 1726882750.15671: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.15674: getting variables 27844 1726882750.15676: in VariableManager get_vars() 27844 1726882750.15718: Calling all_inventory to load vars for managed_node1 27844 1726882750.15721: Calling groups_inventory to load vars for managed_node1 27844 1726882750.15723: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.15735: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.15738: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.15741: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.15921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.16141: done with get_vars() 27844 1726882750.16153: done getting variables 27844 1726882750.16386: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000283 27844 1726882750.16389: WORKER PROCESS EXITING TASK [Manage test interface] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:28 Friday 20 September 2024 21:39:10 -0400 (0:00:00.055) 0:00:09.240 ****** 27844 1726882750.16413: entering _queue_task() for managed_node1/include_tasks 27844 1726882750.17053: worker is 1 (out of 1 available) 27844 1726882750.17271: exiting _queue_task() for managed_node1/include_tasks 27844 1726882750.17281: done queuing things up, now waiting for results queue to drain 27844 1726882750.17283: waiting for pending results... 
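The "Show current_interfaces" debug task above renders its MSG as `current_interfaces: [...]`. Since Jinja2 interpolates a list by stringifying it, the same text can be reproduced with Python's list repr (an illustrative equivalence, not Ansible's actual code path):

```python
# The fact value set by the preceding set_fact task, per the log.
current_interfaces = ["bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0"]

# repr() of a list of strings yields the same bracketed, single-quoted
# form that appears in the debug task's MSG block above.
msg = "current_interfaces: %r" % (current_interfaces,)
```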
27844 1726882750.17303: running TaskExecutor() for managed_node1/TASK: Manage test interface 27844 1726882750.17413: in run() - task 0e448fcc-3ce9-efa9-466a-000000000011 27844 1726882750.17438: variable 'ansible_search_path' from source: unknown 27844 1726882750.17482: calling self._execute() 27844 1726882750.17574: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.17587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.17600: variable 'omit' from source: magic vars 27844 1726882750.18243: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.18259: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.18273: _execute() done 27844 1726882750.18296: dumping result to json 27844 1726882750.18406: done dumping result, returning 27844 1726882750.18438: done running TaskExecutor() for managed_node1/TASK: Manage test interface [0e448fcc-3ce9-efa9-466a-000000000011] 27844 1726882750.18448: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000011 27844 1726882750.18789: no more pending results, returning what we have 27844 1726882750.18794: in VariableManager get_vars() 27844 1726882750.18834: Calling all_inventory to load vars for managed_node1 27844 1726882750.18836: Calling groups_inventory to load vars for managed_node1 27844 1726882750.18838: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.18850: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.18853: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.18856: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.19110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.19314: done with get_vars() 27844 1726882750.19322: variable 'ansible_search_path' from source: unknown 27844 1726882750.19335: 
we have included files to process 27844 1726882750.19336: generating all_blocks data 27844 1726882750.19338: done generating all_blocks data 27844 1726882750.19343: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27844 1726882750.19344: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27844 1726882750.19347: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27844 1726882750.19869: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000011 27844 1726882750.19872: WORKER PROCESS EXITING 27844 1726882750.20395: in VariableManager get_vars() 27844 1726882750.20422: done with get_vars() 27844 1726882750.21354: done processing included file 27844 1726882750.21356: iterating over new_blocks loaded from include file 27844 1726882750.21357: in VariableManager get_vars() 27844 1726882750.21385: done with get_vars() 27844 1726882750.21388: filtering new block on tags 27844 1726882750.21421: done filtering new block on tags 27844 1726882750.21423: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 27844 1726882750.21428: extending task lists for all hosts with included blocks 27844 1726882750.22398: done extending task lists 27844 1726882750.22400: done processing included files 27844 1726882750.22400: results queue empty 27844 1726882750.22401: checking for any_errors_fatal 27844 1726882750.22404: done checking for any_errors_fatal 27844 1726882750.22404: checking for max_fail_percentage 27844 1726882750.22405: done checking for max_fail_percentage 27844 1726882750.22406: checking to see if all hosts have failed and the running 
result is not ok 27844 1726882750.22407: done checking to see if all hosts have failed 27844 1726882750.22408: getting the remaining hosts for this loop 27844 1726882750.22409: done getting the remaining hosts for this loop 27844 1726882750.22411: getting the next task for host managed_node1 27844 1726882750.22414: done getting next task for host managed_node1 27844 1726882750.22416: ^ task is: TASK: Ensure state in ["present", "absent"] 27844 1726882750.22419: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.22421: getting variables 27844 1726882750.22422: in VariableManager get_vars() 27844 1726882750.22434: Calling all_inventory to load vars for managed_node1 27844 1726882750.22436: Calling groups_inventory to load vars for managed_node1 27844 1726882750.22438: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.22443: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.22446: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.22449: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.22593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.22795: done with get_vars() 27844 1726882750.22805: done getting variables 27844 1726882750.22841: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:39:10 -0400 (0:00:00.064) 0:00:09.305 ****** 27844 1726882750.22869: entering _queue_task() for managed_node1/fail 27844 1726882750.23250: worker is 1 (out of 1 available) 27844 1726882750.23263: exiting _queue_task() for managed_node1/fail 27844 1726882750.23279: done queuing things up, now waiting for results queue to drain 27844 1726882750.23281: waiting for pending results... 
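Annotation: the task queued above is skipped a few entries later because its `when` guard evaluates to False. Based on the logged condition (`state not in ["present", "absent"]`) and the task name, the task at `manage_test_interface.yml:3` is presumably a guard of roughly this shape — a sketch reconstructed from the log; the `msg` text is illustrative, not taken from the actual file:

```yaml
# Hypothetical reconstruction of the guard task at manage_test_interface.yml:3.
# The condition string is quoted verbatim in the log; the fail message is invented.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent', got '{{ state }}'"  # illustrative
  when: state not in ["present", "absent"]
```

Because `state` came from include params as a valid value, the conditional result is False and the `skipping: [managed_node1]` record below is emitted instead of a failure.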
27844 1726882750.23523: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 27844 1726882750.23624: in run() - task 0e448fcc-3ce9-efa9-466a-0000000002fc 27844 1726882750.23645: variable 'ansible_search_path' from source: unknown 27844 1726882750.23652: variable 'ansible_search_path' from source: unknown 27844 1726882750.23696: calling self._execute() 27844 1726882750.23800: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.23812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.23825: variable 'omit' from source: magic vars 27844 1726882750.24250: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.24278: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.24518: variable 'state' from source: include params 27844 1726882750.24539: Evaluated conditional (state not in ["present", "absent"]): False 27844 1726882750.24542: when evaluation is False, skipping this task 27844 1726882750.24547: _execute() done 27844 1726882750.24550: dumping result to json 27844 1726882750.24552: done dumping result, returning 27844 1726882750.24554: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-efa9-466a-0000000002fc] 27844 1726882750.24556: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002fc 27844 1726882750.24653: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002fc 27844 1726882750.24656: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 27844 1726882750.24720: no more pending results, returning what we have 27844 1726882750.24723: results queue empty 27844 1726882750.24724: checking for any_errors_fatal 27844 1726882750.24726: done checking for any_errors_fatal 27844 1726882750.24727: 
checking for max_fail_percentage 27844 1726882750.24728: done checking for max_fail_percentage 27844 1726882750.24729: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.24729: done checking to see if all hosts have failed 27844 1726882750.24730: getting the remaining hosts for this loop 27844 1726882750.24731: done getting the remaining hosts for this loop 27844 1726882750.24734: getting the next task for host managed_node1 27844 1726882750.24738: done getting next task for host managed_node1 27844 1726882750.24740: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 27844 1726882750.24743: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.24746: getting variables 27844 1726882750.24747: in VariableManager get_vars() 27844 1726882750.24788: Calling all_inventory to load vars for managed_node1 27844 1726882750.24794: Calling groups_inventory to load vars for managed_node1 27844 1726882750.24796: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.24803: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.24805: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.24807: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.24934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.25056: done with get_vars() 27844 1726882750.25062: done getting variables 27844 1726882750.25101: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:39:10 -0400 (0:00:00.022) 0:00:09.327 ****** 27844 1726882750.25121: entering _queue_task() for managed_node1/fail 27844 1726882750.25277: worker is 1 (out of 1 available) 27844 1726882750.25290: exiting _queue_task() for managed_node1/fail 27844 1726882750.25301: done queuing things up, now waiting for results queue to drain 27844 1726882750.25303: waiting for pending results... 
27844 1726882750.25453: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 27844 1726882750.25514: in run() - task 0e448fcc-3ce9-efa9-466a-0000000002fd 27844 1726882750.25526: variable 'ansible_search_path' from source: unknown 27844 1726882750.25529: variable 'ansible_search_path' from source: unknown 27844 1726882750.25559: calling self._execute() 27844 1726882750.25621: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.25624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.25633: variable 'omit' from source: magic vars 27844 1726882750.25971: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.25988: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.26140: variable 'type' from source: set_fact 27844 1726882750.26152: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 27844 1726882750.26159: when evaluation is False, skipping this task 27844 1726882750.26168: _execute() done 27844 1726882750.26176: dumping result to json 27844 1726882750.26184: done dumping result, returning 27844 1726882750.26193: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-efa9-466a-0000000002fd] 27844 1726882750.26202: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002fd 27844 1726882750.26309: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002fd 27844 1726882750.26316: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 27844 1726882750.26413: no more pending results, returning what we have 27844 1726882750.26417: results queue empty 27844 1726882750.26418: checking for any_errors_fatal 27844 1726882750.26424: done checking for any_errors_fatal 27844 1726882750.26425: 
checking for max_fail_percentage 27844 1726882750.26426: done checking for max_fail_percentage 27844 1726882750.26427: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.26428: done checking to see if all hosts have failed 27844 1726882750.26429: getting the remaining hosts for this loop 27844 1726882750.26430: done getting the remaining hosts for this loop 27844 1726882750.26433: getting the next task for host managed_node1 27844 1726882750.26440: done getting next task for host managed_node1 27844 1726882750.26442: ^ task is: TASK: Include the task 'show_interfaces.yml' 27844 1726882750.26446: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.26450: getting variables 27844 1726882750.26452: in VariableManager get_vars() 27844 1726882750.26521: Calling all_inventory to load vars for managed_node1 27844 1726882750.26524: Calling groups_inventory to load vars for managed_node1 27844 1726882750.26527: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.26539: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.26542: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.26545: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.27315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.27521: done with get_vars() 27844 1726882750.27530: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:39:10 -0400 (0:00:00.024) 0:00:09.352 ****** 27844 1726882750.27618: entering _queue_task() for managed_node1/include_tasks 27844 1726882750.27808: worker is 1 (out of 1 available) 27844 1726882750.27820: exiting _queue_task() for managed_node1/include_tasks 27844 1726882750.27832: done queuing things up, now waiting for results queue to drain 27844 1726882750.27833: waiting for pending results... 
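Annotation: the entries that follow show `include_tasks` processing — the included file is loaded, its blocks are generated, filtered on tags, and appended to the host's task list ("extending task lists for all hosts with included blocks"). The including task at `manage_test_interface.yml:13` presumably looks like this minimal sketch (reconstructed from the task name and the resolved path in the log):

```yaml
# Sketch of the include at manage_test_interface.yml:13; the relative path
# is inferred from the resolved file shown in the log entries below.
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
```

Note that `include_tasks` is dynamic: the new blocks only appear in the host state after this point, which is why the subsequent `^ state is:` dumps grow an extra nested "tasks child state" level.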
27844 1726882750.28011: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 27844 1726882750.28117: in run() - task 0e448fcc-3ce9-efa9-466a-0000000002fe 27844 1726882750.28145: variable 'ansible_search_path' from source: unknown 27844 1726882750.28174: variable 'ansible_search_path' from source: unknown 27844 1726882750.28213: calling self._execute() 27844 1726882750.28302: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.28314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.28327: variable 'omit' from source: magic vars 27844 1726882750.28735: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.28752: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.28762: _execute() done 27844 1726882750.28775: dumping result to json 27844 1726882750.28785: done dumping result, returning 27844 1726882750.28795: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-efa9-466a-0000000002fe] 27844 1726882750.28811: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002fe 27844 1726882750.28951: no more pending results, returning what we have 27844 1726882750.28957: in VariableManager get_vars() 27844 1726882750.29005: Calling all_inventory to load vars for managed_node1 27844 1726882750.29008: Calling groups_inventory to load vars for managed_node1 27844 1726882750.29010: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.29025: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.29028: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.29031: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.29258: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002fe 27844 1726882750.29262: WORKER PROCESS EXITING 27844 1726882750.29283: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.29423: done with get_vars() 27844 1726882750.29428: variable 'ansible_search_path' from source: unknown 27844 1726882750.29428: variable 'ansible_search_path' from source: unknown 27844 1726882750.29449: we have included files to process 27844 1726882750.29450: generating all_blocks data 27844 1726882750.29451: done generating all_blocks data 27844 1726882750.29454: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882750.29455: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882750.29456: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27844 1726882750.29531: in VariableManager get_vars() 27844 1726882750.29545: done with get_vars() 27844 1726882750.29619: done processing included file 27844 1726882750.29620: iterating over new_blocks loaded from include file 27844 1726882750.29621: in VariableManager get_vars() 27844 1726882750.29632: done with get_vars() 27844 1726882750.29633: filtering new block on tags 27844 1726882750.29643: done filtering new block on tags 27844 1726882750.29644: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 27844 1726882750.29648: extending task lists for all hosts with included blocks 27844 1726882750.29897: done extending task lists 27844 1726882750.29898: done processing included files 27844 1726882750.29898: results queue empty 27844 1726882750.29899: checking for any_errors_fatal 27844 1726882750.29901: done checking for any_errors_fatal 27844 1726882750.29901: checking for 
max_fail_percentage 27844 1726882750.29902: done checking for max_fail_percentage 27844 1726882750.29902: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.29903: done checking to see if all hosts have failed 27844 1726882750.29903: getting the remaining hosts for this loop 27844 1726882750.29904: done getting the remaining hosts for this loop 27844 1726882750.29906: getting the next task for host managed_node1 27844 1726882750.29908: done getting next task for host managed_node1 27844 1726882750.29909: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27844 1726882750.29911: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.29913: getting variables 27844 1726882750.29913: in VariableManager get_vars() 27844 1726882750.29922: Calling all_inventory to load vars for managed_node1 27844 1726882750.29924: Calling groups_inventory to load vars for managed_node1 27844 1726882750.29925: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.29929: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.29931: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.29933: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.30055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.30234: done with get_vars() 27844 1726882750.30243: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:10 -0400 (0:00:00.026) 0:00:09.379 ****** 27844 1726882750.30311: entering _queue_task() for managed_node1/include_tasks 27844 1726882750.30504: worker is 1 (out of 1 available) 27844 1726882750.30516: exiting _queue_task() for managed_node1/include_tasks 27844 1726882750.30529: done queuing things up, now waiting for results queue to drain 27844 1726882750.30531: waiting for pending results... 
27844 1726882750.30783: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 27844 1726882750.30877: in run() - task 0e448fcc-3ce9-efa9-466a-000000000374 27844 1726882750.30888: variable 'ansible_search_path' from source: unknown 27844 1726882750.30892: variable 'ansible_search_path' from source: unknown 27844 1726882750.30928: calling self._execute() 27844 1726882750.31010: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.31013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.31024: variable 'omit' from source: magic vars 27844 1726882750.31549: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.31560: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.31570: _execute() done 27844 1726882750.31573: dumping result to json 27844 1726882750.31577: done dumping result, returning 27844 1726882750.31586: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-efa9-466a-000000000374] 27844 1726882750.31626: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000374 27844 1726882750.31714: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000374 27844 1726882750.31718: WORKER PROCESS EXITING 27844 1726882750.31748: no more pending results, returning what we have 27844 1726882750.31753: in VariableManager get_vars() 27844 1726882750.31799: Calling all_inventory to load vars for managed_node1 27844 1726882750.31802: Calling groups_inventory to load vars for managed_node1 27844 1726882750.31805: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.31818: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.31821: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.31824: Calling groups_plugins_play to load vars for managed_node1 27844 
1726882750.32013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.32202: done with get_vars() 27844 1726882750.32209: variable 'ansible_search_path' from source: unknown 27844 1726882750.32210: variable 'ansible_search_path' from source: unknown 27844 1726882750.32270: we have included files to process 27844 1726882750.32271: generating all_blocks data 27844 1726882750.32274: done generating all_blocks data 27844 1726882750.32275: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882750.32276: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882750.32279: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27844 1726882750.32672: done processing included file 27844 1726882750.32675: iterating over new_blocks loaded from include file 27844 1726882750.32676: in VariableManager get_vars() 27844 1726882750.32695: done with get_vars() 27844 1726882750.32696: filtering new block on tags 27844 1726882750.32719: done filtering new block on tags 27844 1726882750.32721: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 27844 1726882750.32726: extending task lists for all hosts with included blocks 27844 1726882750.32994: done extending task lists 27844 1726882750.32996: done processing included files 27844 1726882750.32997: results queue empty 27844 1726882750.32997: checking for any_errors_fatal 27844 1726882750.33000: done checking for any_errors_fatal 27844 1726882750.33012: checking for max_fail_percentage 27844 1726882750.33019: done 
checking for max_fail_percentage 27844 1726882750.33020: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.33021: done checking to see if all hosts have failed 27844 1726882750.33022: getting the remaining hosts for this loop 27844 1726882750.33028: done getting the remaining hosts for this loop 27844 1726882750.33031: getting the next task for host managed_node1 27844 1726882750.33035: done getting next task for host managed_node1 27844 1726882750.33037: ^ task is: TASK: Gather current interface info 27844 1726882750.33041: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.33043: getting variables 27844 1726882750.33043: in VariableManager get_vars() 27844 1726882750.33067: Calling all_inventory to load vars for managed_node1 27844 1726882750.33069: Calling groups_inventory to load vars for managed_node1 27844 1726882750.33073: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.33078: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.33081: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.33084: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.33236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.33362: done with get_vars() 27844 1726882750.33370: done getting variables 27844 1726882750.33398: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:10 -0400 (0:00:00.031) 0:00:09.410 ****** 27844 1726882750.33418: entering _queue_task() for managed_node1/command 27844 1726882750.33586: worker is 1 (out of 1 available) 27844 1726882750.33597: exiting _queue_task() for managed_node1/command 27844 1726882750.33609: done queuing things up, now waiting for results queue to drain 27844 1726882750.33611: waiting for pending results... 
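Annotation: the long `_low_level_execute_command()` trace that follows is the `command` action running over SSH (home-dir probe, then remote tmpdir creation, then module execution). The exact command of the "Gather current interface info" task is not visible in this excerpt; a plausible sketch of a task at `get_current_interfaces.yml:3` that would produce this trace — the command, register name, and `changed_when` are all assumptions:

```yaml
# Hypothetical sketch only: listing /sys/class/net is one common way such
# test helpers enumerate interfaces; the real task's command is not logged here.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: _current_interfaces  # invented variable name
  changed_when: false            # a read-only probe should not report "changed"
```

The `Set connection var ...` entries just before the SSH chunks show the resolved connection settings (shell `sh`, pipelining False, timeout 10) that govern how this command is wrapped and executed remotely.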
27844 1726882750.33762: running TaskExecutor() for managed_node1/TASK: Gather current interface info 27844 1726882750.33835: in run() - task 0e448fcc-3ce9-efa9-466a-0000000003ab 27844 1726882750.33845: variable 'ansible_search_path' from source: unknown 27844 1726882750.33848: variable 'ansible_search_path' from source: unknown 27844 1726882750.33879: calling self._execute() 27844 1726882750.33942: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.33947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.33953: variable 'omit' from source: magic vars 27844 1726882750.34196: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.34206: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.34211: variable 'omit' from source: magic vars 27844 1726882750.34244: variable 'omit' from source: magic vars 27844 1726882750.34273: variable 'omit' from source: magic vars 27844 1726882750.34304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882750.34329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882750.34343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882750.34357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.34370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.34393: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882750.34396: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.34398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 
1726882750.34461: Set connection var ansible_shell_type to sh 27844 1726882750.34471: Set connection var ansible_connection to ssh 27844 1726882750.34474: Set connection var ansible_pipelining to False 27844 1726882750.34477: Set connection var ansible_timeout to 10 27844 1726882750.34484: Set connection var ansible_shell_executable to /bin/sh 27844 1726882750.34486: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882750.34506: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.34508: variable 'ansible_connection' from source: unknown 27844 1726882750.34511: variable 'ansible_module_compression' from source: unknown 27844 1726882750.34513: variable 'ansible_shell_type' from source: unknown 27844 1726882750.34516: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.34518: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.34522: variable 'ansible_pipelining' from source: unknown 27844 1726882750.34524: variable 'ansible_timeout' from source: unknown 27844 1726882750.34528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.34624: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882750.34632: variable 'omit' from source: magic vars 27844 1726882750.34637: starting attempt loop 27844 1726882750.34639: running the handler 27844 1726882750.34651: _low_level_execute_command(): starting 27844 1726882750.34658: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882750.35239: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 27844 1726882750.35260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.35341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.35345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.35425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.37088: stdout chunk (state=3): >>>/root <<< 27844 1726882750.37201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.37256: stderr chunk (state=3): >>><<< 27844 1726882750.37273: stdout chunk (state=3): >>><<< 27844 1726882750.37355: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.37358: _low_level_execute_command(): starting 27844 1726882750.37361: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969 `" && echo ansible-tmp-1726882750.3731554-28381-51048196473969="` echo /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969 `" ) && sleep 0' 27844 1726882750.37914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882750.37928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.37933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.37946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.37984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.37991: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882750.38000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.38013: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 27844 1726882750.38022: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882750.38030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882750.38039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.38045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.38056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.38070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.38073: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882750.38084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.38162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.38172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.38179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.38346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.40167: stdout chunk (state=3): >>>ansible-tmp-1726882750.3731554-28381-51048196473969=/root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969 <<< 27844 1726882750.40287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.40334: stderr chunk (state=3): >>><<< 27844 1726882750.40337: stdout chunk (state=3): >>><<< 27844 1726882750.40349: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882750.3731554-28381-51048196473969=/root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.40375: variable 'ansible_module_compression' from source: unknown 27844 1726882750.40416: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882750.40444: variable 'ansible_facts' from source: unknown 27844 1726882750.40505: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969/AnsiballZ_command.py 27844 1726882750.40600: Sending initial data 27844 1726882750.40603: Sent initial data (155 bytes) 27844 1726882750.41278: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.41297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
<<< 27844 1726882750.41309: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882750.41322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.41340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882750.41352: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882750.41369: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882750.41384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.41398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.41413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.41426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.41437: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882750.41451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.41528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.41546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.41562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.41693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.43410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882750.43499: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882750.43592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpeldjwbpf /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969/AnsiballZ_command.py <<< 27844 1726882750.43683: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882750.44693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.44778: stderr chunk (state=3): >>><<< 27844 1726882750.44782: stdout chunk (state=3): >>><<< 27844 1726882750.44795: done transferring module to remote 27844 1726882750.44803: _low_level_execute_command(): starting 27844 1726882750.44807: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969/ /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969/AnsiballZ_command.py && sleep 0' 27844 1726882750.45215: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.45220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.45262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.45268: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.45275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.45327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.45330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.45436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.47192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.47233: stderr chunk (state=3): >>><<< 27844 1726882750.47236: stdout chunk (state=3): >>><<< 27844 1726882750.47247: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.47250: _low_level_execute_command(): starting 27844 1726882750.47254: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969/AnsiballZ_command.py && sleep 0' 27844 1726882750.47656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.47662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.47693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.47705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.47752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.47773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.47882: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27844 1726882750.61276: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:10.608042", "end": "2024-09-20 21:39:10.611353", "delta": "0:00:00.003311", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882750.62446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882750.62492: stderr chunk (state=3): >>><<< 27844 1726882750.62495: stdout chunk (state=3): >>><<< 27844 1726882750.62508: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:10.608042", "end": "2024-09-20 21:39:10.611353", "delta": "0:00:00.003311", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882750.62540: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882750.62547: _low_level_execute_command(): starting 27844 1726882750.62550: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882750.3731554-28381-51048196473969/ > /dev/null 2>&1 && sleep 0' 27844 1726882750.62972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.62987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.63006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882750.63019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.63033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.63075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.63089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.63200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.64976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.65023: stderr chunk (state=3): >>><<< 27844 1726882750.65026: stdout chunk (state=3): >>><<< 27844 1726882750.65068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.65071: handler run complete 27844 1726882750.65087: Evaluated conditional (False): False 27844 1726882750.65095: attempt loop complete, returning result 27844 1726882750.65098: _execute() done 27844 1726882750.65100: dumping result to json 27844 1726882750.65105: done dumping result, returning 27844 1726882750.65111: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-efa9-466a-0000000003ab] 27844 1726882750.65116: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003ab 27844 1726882750.65217: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003ab 27844 1726882750.65220: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003311", "end": "2024-09-20 21:39:10.611353", "rc": 0, "start": "2024-09-20 21:39:10.608042" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 27844 1726882750.65357: no more pending results, returning what we have 27844 1726882750.65360: results queue empty 27844 1726882750.65361: checking for any_errors_fatal 27844 1726882750.65362: done checking for any_errors_fatal 27844 1726882750.65372: checking for max_fail_percentage 27844 1726882750.65375: done checking for max_fail_percentage 27844 1726882750.65375: checking to see if all hosts have 
failed and the running result is not ok 27844 1726882750.65376: done checking to see if all hosts have failed 27844 1726882750.65377: getting the remaining hosts for this loop 27844 1726882750.65378: done getting the remaining hosts for this loop 27844 1726882750.65381: getting the next task for host managed_node1 27844 1726882750.65387: done getting next task for host managed_node1 27844 1726882750.65389: ^ task is: TASK: Set current_interfaces 27844 1726882750.65394: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.65397: getting variables 27844 1726882750.65398: in VariableManager get_vars() 27844 1726882750.65434: Calling all_inventory to load vars for managed_node1 27844 1726882750.65437: Calling groups_inventory to load vars for managed_node1 27844 1726882750.65438: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.65446: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.65448: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.65449: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.65562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.65684: done with get_vars() 27844 1726882750.65692: done getting variables 27844 1726882750.65735: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:10 -0400 (0:00:00.323) 0:00:09.734 ****** 27844 1726882750.65757: entering _queue_task() for managed_node1/set_fact 27844 1726882750.65941: worker is 1 (out of 1 available) 27844 1726882750.65954: exiting _queue_task() for managed_node1/set_fact 27844 1726882750.65967: done queuing things up, now waiting for results queue to drain 27844 1726882750.65968: waiting for pending results... 
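The "Gather current interface info" task above returned `"stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0"`, which the next task ("Set current_interfaces") turns into a list fact. A minimal sketch of that mapping (not Ansible's actual implementation; the raw string simply copies the relevant fields from the module result logged above):

```python
import json

# Fields copied from the command module's JSON result in the log above.
raw = ('{"changed": true, "rc": 0, "cmd": ["ls", "-1"],'
       ' "stdout": "bonding_masters\\neth0\\nethtest0\\nlo\\npeerethtest0"}')
result = json.loads(raw)

# Ansible derives stdout_lines by splitting stdout on newlines; the
# set_fact task then stores this list as current_interfaces.
stdout_lines = result["stdout"].split("\n")
print(stdout_lines)  # -> ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0']
```

This is why the later `ok: [managed_node1]` result shows `current_interfaces` with exactly those five entries.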
27844 1726882750.66131: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 27844 1726882750.66202: in run() - task 0e448fcc-3ce9-efa9-466a-0000000003ac 27844 1726882750.66214: variable 'ansible_search_path' from source: unknown 27844 1726882750.66217: variable 'ansible_search_path' from source: unknown 27844 1726882750.66249: calling self._execute() 27844 1726882750.66317: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.66321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.66328: variable 'omit' from source: magic vars 27844 1726882750.66731: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.66747: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.66757: variable 'omit' from source: magic vars 27844 1726882750.66825: variable 'omit' from source: magic vars 27844 1726882750.66942: variable '_current_interfaces' from source: set_fact 27844 1726882750.67025: variable 'omit' from source: magic vars 27844 1726882750.67072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882750.67114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882750.67151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882750.67180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.67195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.67228: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882750.67247: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.67254: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.67369: Set connection var ansible_shell_type to sh 27844 1726882750.67377: Set connection var ansible_connection to ssh 27844 1726882750.67387: Set connection var ansible_pipelining to False 27844 1726882750.67396: Set connection var ansible_timeout to 10 27844 1726882750.67404: Set connection var ansible_shell_executable to /bin/sh 27844 1726882750.67412: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882750.67439: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.67449: variable 'ansible_connection' from source: unknown 27844 1726882750.67471: variable 'ansible_module_compression' from source: unknown 27844 1726882750.67481: variable 'ansible_shell_type' from source: unknown 27844 1726882750.67488: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.67494: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.67502: variable 'ansible_pipelining' from source: unknown 27844 1726882750.67509: variable 'ansible_timeout' from source: unknown 27844 1726882750.67516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.67670: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882750.67697: variable 'omit' from source: magic vars 27844 1726882750.67707: starting attempt loop 27844 1726882750.67714: running the handler 27844 1726882750.67729: handler run complete 27844 1726882750.67745: attempt loop complete, returning result 27844 1726882750.67752: _execute() done 27844 1726882750.67758: dumping result to json 27844 1726882750.67771: done dumping result, returning 27844 
1726882750.67791: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-efa9-466a-0000000003ac] 27844 1726882750.67803: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003ac 27844 1726882750.67915: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003ac 27844 1726882750.67918: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0" ] }, "changed": false } 27844 1726882750.67974: no more pending results, returning what we have 27844 1726882750.67977: results queue empty 27844 1726882750.67978: checking for any_errors_fatal 27844 1726882750.67986: done checking for any_errors_fatal 27844 1726882750.67987: checking for max_fail_percentage 27844 1726882750.67988: done checking for max_fail_percentage 27844 1726882750.67989: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.67990: done checking to see if all hosts have failed 27844 1726882750.67991: getting the remaining hosts for this loop 27844 1726882750.67992: done getting the remaining hosts for this loop 27844 1726882750.67996: getting the next task for host managed_node1 27844 1726882750.68005: done getting next task for host managed_node1 27844 1726882750.68007: ^ task is: TASK: Show current_interfaces 27844 1726882750.68011: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882750.68015: getting variables 27844 1726882750.68016: in VariableManager get_vars() 27844 1726882750.68049: Calling all_inventory to load vars for managed_node1 27844 1726882750.68052: Calling groups_inventory to load vars for managed_node1 27844 1726882750.68054: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.68062: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.68069: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.68071: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.68245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.68365: done with get_vars() 27844 1726882750.68372: done getting variables 27844 1726882750.68414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:10 -0400 (0:00:00.026) 0:00:09.760 ****** 27844 1726882750.68434: entering _queue_task() for managed_node1/debug 27844 1726882750.68596: worker is 1 (out of 1 available) 27844 1726882750.68608: exiting _queue_task() for managed_node1/debug 27844 1726882750.68620: done queuing things up, now waiting for results queue to drain 27844 
1726882750.68621: waiting for pending results... 27844 1726882750.68784: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 27844 1726882750.68853: in run() - task 0e448fcc-3ce9-efa9-466a-000000000375 27844 1726882750.68866: variable 'ansible_search_path' from source: unknown 27844 1726882750.68870: variable 'ansible_search_path' from source: unknown 27844 1726882750.68904: calling self._execute() 27844 1726882750.68966: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.68974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.68981: variable 'omit' from source: magic vars 27844 1726882750.69241: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.69251: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.69257: variable 'omit' from source: magic vars 27844 1726882750.69293: variable 'omit' from source: magic vars 27844 1726882750.69360: variable 'current_interfaces' from source: set_fact 27844 1726882750.69384: variable 'omit' from source: magic vars 27844 1726882750.69413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882750.69441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882750.69455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882750.69472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.69482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.69503: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882750.69506: variable 'ansible_host' from source: host vars for 
'managed_node1' 27844 1726882750.69509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.69579: Set connection var ansible_shell_type to sh 27844 1726882750.69583: Set connection var ansible_connection to ssh 27844 1726882750.69585: Set connection var ansible_pipelining to False 27844 1726882750.69591: Set connection var ansible_timeout to 10 27844 1726882750.69596: Set connection var ansible_shell_executable to /bin/sh 27844 1726882750.69602: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882750.69621: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.69625: variable 'ansible_connection' from source: unknown 27844 1726882750.69627: variable 'ansible_module_compression' from source: unknown 27844 1726882750.69629: variable 'ansible_shell_type' from source: unknown 27844 1726882750.69631: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.69633: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.69637: variable 'ansible_pipelining' from source: unknown 27844 1726882750.69639: variable 'ansible_timeout' from source: unknown 27844 1726882750.69643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.69744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882750.69754: variable 'omit' from source: magic vars 27844 1726882750.69763: starting attempt loop 27844 1726882750.69773: running the handler 27844 1726882750.69808: handler run complete 27844 1726882750.69820: attempt loop complete, returning result 27844 1726882750.69823: _execute() done 27844 1726882750.69826: dumping result to json 27844 1726882750.69828: done 
dumping result, returning 27844 1726882750.69833: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-efa9-466a-000000000375] 27844 1726882750.69839: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000375 27844 1726882750.69917: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000375 27844 1726882750.69920: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0'] 27844 1726882750.69984: no more pending results, returning what we have 27844 1726882750.69987: results queue empty 27844 1726882750.69988: checking for any_errors_fatal 27844 1726882750.70056: done checking for any_errors_fatal 27844 1726882750.70058: checking for max_fail_percentage 27844 1726882750.70060: done checking for max_fail_percentage 27844 1726882750.70060: checking to see if all hosts have failed and the running result is not ok 27844 1726882750.70061: done checking to see if all hosts have failed 27844 1726882750.70062: getting the remaining hosts for this loop 27844 1726882750.70065: done getting the remaining hosts for this loop 27844 1726882750.70069: getting the next task for host managed_node1 27844 1726882750.70081: done getting next task for host managed_node1 27844 1726882750.70084: ^ task is: TASK: Install iproute 27844 1726882750.70086: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882750.70096: getting variables 27844 1726882750.70098: in VariableManager get_vars() 27844 1726882750.70132: Calling all_inventory to load vars for managed_node1 27844 1726882750.70135: Calling groups_inventory to load vars for managed_node1 27844 1726882750.70138: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882750.70146: Calling all_plugins_play to load vars for managed_node1 27844 1726882750.70149: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882750.70152: Calling groups_plugins_play to load vars for managed_node1 27844 1726882750.70329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882750.70535: done with get_vars() 27844 1726882750.70544: done getting variables 27844 1726882750.70600: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:39:10 -0400 (0:00:00.021) 0:00:09.782 ****** 27844 1726882750.70627: entering _queue_task() for managed_node1/package 27844 1726882750.71210: worker is 1 (out of 1 available) 27844 1726882750.71221: exiting _queue_task() for managed_node1/package 27844 1726882750.71232: done queuing things up, now waiting for results queue to drain 27844 1726882750.71234: waiting for pending results... 
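The two tasks traced in this stretch of the log (task paths `show_interfaces.yml:5` and `manage_test_interface.yml:16`) likely resemble the following minimal sketch. The task bodies are reconstructed from the debug output (the `MSG:` line and the `dnf` module arguments `name: ["iproute"], state: present`), not copied from the actual role, so treat them as assumptions:

```yaml
# Sketch reconstructed from the debug output above; not the actual role source.
- name: Show current_interfaces          # tasks/show_interfaces.yml:5
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

- name: Install iproute                  # tasks/manage_test_interface.yml:16
  package:
    name: iproute
    state: present
```

Note that the generic `package` action resolves to the `dnf` module on this host, which is why the transferred payload below is named `AnsiballZ_dnf.py`.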
27844 1726882750.71494: running TaskExecutor() for managed_node1/TASK: Install iproute 27844 1726882750.71587: in run() - task 0e448fcc-3ce9-efa9-466a-0000000002ff 27844 1726882750.71605: variable 'ansible_search_path' from source: unknown 27844 1726882750.71611: variable 'ansible_search_path' from source: unknown 27844 1726882750.71649: calling self._execute() 27844 1726882750.71738: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.71748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.71761: variable 'omit' from source: magic vars 27844 1726882750.72196: variable 'ansible_distribution_major_version' from source: facts 27844 1726882750.72216: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882750.72227: variable 'omit' from source: magic vars 27844 1726882750.72271: variable 'omit' from source: magic vars 27844 1726882750.72462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882750.74915: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882750.74979: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882750.75018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882750.75068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882750.75117: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882750.75216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882750.75257: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882750.75295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882750.75342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882750.75371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882750.75480: variable '__network_is_ostree' from source: set_fact 27844 1726882750.75490: variable 'omit' from source: magic vars 27844 1726882750.75520: variable 'omit' from source: magic vars 27844 1726882750.75550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882750.75588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882750.75610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882750.75633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.75647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882750.75689: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882750.75698: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.75705: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 27844 1726882750.75811: Set connection var ansible_shell_type to sh 27844 1726882750.75819: Set connection var ansible_connection to ssh 27844 1726882750.75829: Set connection var ansible_pipelining to False 27844 1726882750.75839: Set connection var ansible_timeout to 10 27844 1726882750.75848: Set connection var ansible_shell_executable to /bin/sh 27844 1726882750.75859: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882750.75896: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.75906: variable 'ansible_connection' from source: unknown 27844 1726882750.75914: variable 'ansible_module_compression' from source: unknown 27844 1726882750.75921: variable 'ansible_shell_type' from source: unknown 27844 1726882750.75927: variable 'ansible_shell_executable' from source: unknown 27844 1726882750.75933: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882750.75940: variable 'ansible_pipelining' from source: unknown 27844 1726882750.75947: variable 'ansible_timeout' from source: unknown 27844 1726882750.75954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882750.76059: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882750.76080: variable 'omit' from source: magic vars 27844 1726882750.76090: starting attempt loop 27844 1726882750.76097: running the handler 27844 1726882750.76107: variable 'ansible_facts' from source: unknown 27844 1726882750.76118: variable 'ansible_facts' from source: unknown 27844 1726882750.76154: _low_level_execute_command(): starting 27844 1726882750.76173: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 
1726882750.76920: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882750.76937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.76954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.76983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.77032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.77046: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882750.77062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.77086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882750.77102: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882750.77115: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882750.77128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.77142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.77158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.77176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.77189: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882750.77204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.77286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.77319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.77339: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.77477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.79151: stdout chunk (state=3): >>>/root <<< 27844 1726882750.79261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.79367: stderr chunk (state=3): >>><<< 27844 1726882750.79388: stdout chunk (state=3): >>><<< 27844 1726882750.79519: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.79523: _low_level_execute_command(): starting 27844 1726882750.79527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599 `" && echo 
ansible-tmp-1726882750.7942595-28394-165239064248599="` echo /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599 `" ) && sleep 0' 27844 1726882750.81303: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882750.81316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.81330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.81358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.81408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.81423: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882750.81439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.81473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882750.81487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882750.81499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882750.81511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.81524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.81538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.81549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.81571: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882750.81585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.81660: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 27844 1726882750.81695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.81711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.81836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.83710: stdout chunk (state=3): >>>ansible-tmp-1726882750.7942595-28394-165239064248599=/root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599 <<< 27844 1726882750.83885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.83888: stdout chunk (state=3): >>><<< 27844 1726882750.83891: stderr chunk (state=3): >>><<< 27844 1726882750.84277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882750.7942595-28394-165239064248599=/root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.84280: variable 'ansible_module_compression' from source: unknown 27844 1726882750.84282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 27844 1726882750.84284: variable 'ansible_facts' from source: unknown 27844 1726882750.84340: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599/AnsiballZ_dnf.py 27844 1726882750.84944: Sending initial data 27844 1726882750.84953: Sent initial data (152 bytes) 27844 1726882750.86975: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882750.86984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.86995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.87008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.87044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.87081: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882750.87092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.87107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882750.87250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882750.87257: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882750.87270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.87278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.87290: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.87297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.87304: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882750.87313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.87384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.87402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.87414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.87533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.89290: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882750.89507: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882750.89606: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpj0tjpe0n /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599/AnsiballZ_dnf.py <<< 27844 1726882750.89700: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882750.91973: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.92045: stderr chunk (state=3): >>><<< 27844 1726882750.92049: stdout chunk (state=3): >>><<< 27844 1726882750.92073: done transferring module to remote 27844 1726882750.92093: _low_level_execute_command(): starting 27844 1726882750.92124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599/ /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599/AnsiballZ_dnf.py && sleep 0' 27844 1726882750.92795: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882750.92805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.92817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.92832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.92881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.92889: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882750.92898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.92911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882750.92918: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882750.92924: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882750.92932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.92941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.92951: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.92958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882750.92969: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882750.92981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.93057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.93074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.93081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882750.93204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882750.95009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882750.95012: stderr chunk (state=3): >>><<< 27844 1726882750.95015: stdout chunk (state=3): >>><<< 27844 1726882750.95028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882750.95031: _low_level_execute_command(): starting 27844 1726882750.95037: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599/AnsiballZ_dnf.py && sleep 0' 27844 1726882750.95645: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.95651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882750.95702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.95708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882750.95722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882750.95728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882750.95811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882750.95814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882750.95826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 27844 1726882750.95949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882751.97854: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 27844 1726882752.03535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882752.03539: stderr chunk (state=3): >>><<< 27844 1726882752.03546: stdout chunk (state=3): >>><<< 27844 1726882752.03566: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882752.03614: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882752.03620: _low_level_execute_command(): starting 27844 1726882752.03625: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882750.7942595-28394-165239064248599/ > /dev/null 2>&1 && sleep 0' 27844 1726882752.04790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.04806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.04822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.04840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.04894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.04907: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.04921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.04939: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.04950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.04961: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.04985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.04999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.05016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.05029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.05039: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.05054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.05136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.05153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.05171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.05312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.07291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.07357: stderr chunk (state=3): >>><<< 27844 1726882752.07360: stdout chunk (state=3): >>><<< 27844 1726882752.07469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.07473: handler run complete 27844 1726882752.07572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882752.07749: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882752.07793: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882752.07823: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882752.07851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882752.07933: variable '__install_status' from source: set_fact 27844 1726882752.07961: Evaluated conditional (__install_status is success): True 27844 1726882752.07989: attempt loop complete, returning result 27844 1726882752.07997: _execute() done 27844 1726882752.08004: dumping result to json 27844 1726882752.08013: done dumping result, returning 27844 1726882752.08025: done running TaskExecutor() for managed_node1/TASK: Install iproute [0e448fcc-3ce9-efa9-466a-0000000002ff] 27844 1726882752.08038: 
sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002ff ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 27844 1726882752.08318: no more pending results, returning what we have 27844 1726882752.08321: results queue empty 27844 1726882752.08322: checking for any_errors_fatal 27844 1726882752.08326: done checking for any_errors_fatal 27844 1726882752.08327: checking for max_fail_percentage 27844 1726882752.08329: done checking for max_fail_percentage 27844 1726882752.08330: checking to see if all hosts have failed and the running result is not ok 27844 1726882752.08331: done checking to see if all hosts have failed 27844 1726882752.08332: getting the remaining hosts for this loop 27844 1726882752.08333: done getting the remaining hosts for this loop 27844 1726882752.08337: getting the next task for host managed_node1 27844 1726882752.08343: done getting next task for host managed_node1 27844 1726882752.08346: ^ task is: TASK: Create veth interface {{ interface }} 27844 1726882752.08349: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882752.08352: getting variables 27844 1726882752.08354: in VariableManager get_vars() 27844 1726882752.08390: Calling all_inventory to load vars for managed_node1 27844 1726882752.08393: Calling groups_inventory to load vars for managed_node1 27844 1726882752.08395: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882752.08407: Calling all_plugins_play to load vars for managed_node1 27844 1726882752.08410: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882752.08413: Calling groups_plugins_play to load vars for managed_node1 27844 1726882752.08573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882752.08770: done with get_vars() 27844 1726882752.08781: done getting variables 27844 1726882752.08839: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882752.09160: variable 'interface' from source: set_fact 27844 1726882752.09299: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000002ff 27844 1726882752.09302: WORKER PROCESS EXITING TASK [Create veth interface ethtest1] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:39:12 -0400 (0:00:01.386) 0:00:11.169 ****** 27844 1726882752.09315: entering _queue_task() for managed_node1/command 27844 1726882752.09752: worker is 1 (out of 1 available) 27844 1726882752.09769: exiting _queue_task() for managed_node1/command 27844 1726882752.09782: done queuing things up, now waiting for results queue to drain 27844 1726882752.09784: waiting for pending results... 
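The task banner above ("Create veth interface ethtest1", from manage_test_interface.yml:27) resolves to a plain `ip link add` invocation on the managed node, as the command-module result further down in this log shows (`"cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"]`). A minimal sketch of how that argv is derived from the `interface` variable — `veth_add_argv` is a hypothetical helper for illustration, not code from the role:

```python
def veth_add_argv(interface: str) -> list[str]:
    """Build the `ip link add` argv for a veth pair named after `interface`.

    Mirrors the "cmd" field in the module result below: the peer end is
    named by prefixing the interface name with "peer".
    """
    return ["ip", "link", "add", interface, "type", "veth",
            "peer", "name", f"peer{interface}"]

print(veth_add_argv("ethtest1"))
```

Running the resulting command requires root and the `iproute` package, which is why the preceding "Install iproute" task runs first (it reported "Nothing to do" here because the package was already present).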
27844 1726882752.10777: running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest1 27844 1726882752.10880: in run() - task 0e448fcc-3ce9-efa9-466a-000000000300 27844 1726882752.10898: variable 'ansible_search_path' from source: unknown 27844 1726882752.10904: variable 'ansible_search_path' from source: unknown 27844 1726882752.11146: variable 'interface' from source: set_fact 27844 1726882752.11234: variable 'interface' from source: set_fact 27844 1726882752.11315: variable 'interface' from source: set_fact 27844 1726882752.11457: Loaded config def from plugin (lookup/items) 27844 1726882752.11471: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 27844 1726882752.11500: variable 'omit' from source: magic vars 27844 1726882752.11617: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.11631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.11644: variable 'omit' from source: magic vars 27844 1726882752.11868: variable 'ansible_distribution_major_version' from source: facts 27844 1726882752.11882: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882752.12080: variable 'type' from source: set_fact 27844 1726882752.12090: variable 'state' from source: include params 27844 1726882752.12098: variable 'interface' from source: set_fact 27844 1726882752.12105: variable 'current_interfaces' from source: set_fact 27844 1726882752.12114: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27844 1726882752.12123: variable 'omit' from source: magic vars 27844 1726882752.12165: variable 'omit' from source: magic vars 27844 1726882752.12209: variable 'item' from source: unknown 27844 1726882752.12286: variable 'item' from source: unknown 27844 1726882752.12305: variable 'omit' from source: magic vars 27844 1726882752.12339: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882752.12375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882752.12396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882752.12425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882752.12636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882752.12666: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882752.12676: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.12684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.12784: Set connection var ansible_shell_type to sh 27844 1726882752.12792: Set connection var ansible_connection to ssh 27844 1726882752.12801: Set connection var ansible_pipelining to False 27844 1726882752.12809: Set connection var ansible_timeout to 10 27844 1726882752.12818: Set connection var ansible_shell_executable to /bin/sh 27844 1726882752.12826: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882752.12856: variable 'ansible_shell_executable' from source: unknown 27844 1726882752.12866: variable 'ansible_connection' from source: unknown 27844 1726882752.12874: variable 'ansible_module_compression' from source: unknown 27844 1726882752.12879: variable 'ansible_shell_type' from source: unknown 27844 1726882752.12885: variable 'ansible_shell_executable' from source: unknown 27844 1726882752.12891: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.12898: variable 'ansible_pipelining' from source: unknown 27844 1726882752.12904: variable 'ansible_timeout' from 
source: unknown 27844 1726882752.12910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.13030: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882752.13084: variable 'omit' from source: magic vars 27844 1726882752.13178: starting attempt loop 27844 1726882752.13185: running the handler 27844 1726882752.13201: _low_level_execute_command(): starting 27844 1726882752.13211: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882752.14547: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.14562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.14580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.14603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.14643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.14656: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.14675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.14696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.14711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.14723: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.14736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.14751: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.14769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.14783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.14794: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.14808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.14886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.14903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.14918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.15041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.16616: stdout chunk (state=3): >>>/root <<< 27844 1726882752.16709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.16774: stderr chunk (state=3): >>><<< 27844 1726882752.16777: stdout chunk (state=3): >>><<< 27844 1726882752.16878: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.16889: _low_level_execute_command(): starting 27844 1726882752.16891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598 `" && echo ansible-tmp-1726882752.1679573-28480-90388818731598="` echo /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598 `" ) && sleep 0' 27844 1726882752.17460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.17477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.17491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.17508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.17555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.17570: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.17584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.17600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.17610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.17619: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 
1726882752.17633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.17646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.17666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.17678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.17688: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.17700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.17786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.17807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.17823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.17946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.19793: stdout chunk (state=3): >>>ansible-tmp-1726882752.1679573-28480-90388818731598=/root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598 <<< 27844 1726882752.19979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.19982: stdout chunk (state=3): >>><<< 27844 1726882752.19984: stderr chunk (state=3): >>><<< 27844 1726882752.20273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882752.1679573-28480-90388818731598=/root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.20277: variable 'ansible_module_compression' from source: unknown 27844 1726882752.20280: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882752.20392: variable 'ansible_facts' from source: unknown 27844 1726882752.20470: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598/AnsiballZ_command.py 27844 1726882752.20624: Sending initial data 27844 1726882752.20627: Sent initial data (155 bytes) 27844 1726882752.21584: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.21598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.21613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.21631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.21675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.21694: stderr chunk (state=3): >>>debug2: 
match not found <<< 27844 1726882752.21710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.21729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.21741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.21753: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.21770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.21787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.21809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.21823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.21835: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.21849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.21933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.21954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.21975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.22096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.23811: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882752.23901: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882752.24001: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp9wjuvun5 /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598/AnsiballZ_command.py <<< 27844 1726882752.24094: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882752.25434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.25609: stderr chunk (state=3): >>><<< 27844 1726882752.25612: stdout chunk (state=3): >>><<< 27844 1726882752.25614: done transferring module to remote 27844 1726882752.25616: _low_level_execute_command(): starting 27844 1726882752.25618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598/ /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598/AnsiballZ_command.py && sleep 0' 27844 1726882752.26198: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.26211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.26224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.26240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.26289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.26304: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.26320: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.26337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.26350: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.26368: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.26383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.26402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.26420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.26433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.26536: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.26551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.26622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.26646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.26661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.26794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.28574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.28596: stdout chunk (state=3): >>><<< 27844 1726882752.28599: stderr chunk (state=3): >>><<< 27844 1726882752.28684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.28688: _low_level_execute_command(): starting 27844 1726882752.28691: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598/AnsiballZ_command.py && sleep 0' 27844 1726882752.30001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.30124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.30141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.30160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.30204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.30215: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.30233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.30251: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.30262: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.30281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.30295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.30310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.30326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.30342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.30352: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.30367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.30440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.30472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.30489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.30615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.44691: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-20 21:39:12.435518", "end": "2024-09-20 21:39:12.445048", "delta": "0:00:00.009530", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 
1726882752.46596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882752.46600: stdout chunk (state=3): >>><<< 27844 1726882752.46603: stderr chunk (state=3): >>><<< 27844 1726882752.46673: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-20 21:39:12.435518", "end": "2024-09-20 21:39:12.445048", "delta": "0:00:00.009530", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882752.46683: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest1 type veth peer name peerethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882752.46762: _low_level_execute_command(): starting 27844 1726882752.46773: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882752.1679573-28480-90388818731598/ > /dev/null 2>&1 && sleep 0' 27844 1726882752.47775: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.47779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.47810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.47813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 27844 1726882752.47816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.47887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.47890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.47998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.51218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.51291: stderr chunk (state=3): >>><<< 27844 1726882752.51294: stdout chunk (state=3): >>><<< 27844 1726882752.51471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.51476: handler run complete 27844 1726882752.51478: Evaluated conditional (False): False 27844 1726882752.51480: attempt loop complete, returning result 27844 1726882752.51482: variable 'item' from source: unknown 27844 1726882752.51484: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add ethtest1 type veth peer name peerethtest1) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1" ], "delta": "0:00:00.009530", "end": "2024-09-20 21:39:12.445048", "item": "ip link add ethtest1 type veth peer name peerethtest1", "rc": 0, "start": "2024-09-20 21:39:12.435518" } 27844 1726882752.51772: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.51776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.51843: variable 'omit' from source: magic vars 27844 1726882752.51968: variable 'ansible_distribution_major_version' from source: facts 27844 1726882752.51980: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882752.52182: variable 'type' from source: set_fact 27844 1726882752.52192: variable 'state' from source: include params 27844 1726882752.52201: variable 'interface' from source: set_fact 27844 1726882752.52210: variable 'current_interfaces' from source: set_fact 27844 1726882752.52220: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27844 1726882752.52229: variable 'omit' from source: magic vars 27844 1726882752.52248: variable 'omit' from source: magic vars 27844 1726882752.52305: variable 'item' from source: unknown 27844 1726882752.52370: variable 'item' from source: unknown 27844 1726882752.52402: variable 'omit' from source: magic vars 27844 1726882752.52431: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882752.52444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882752.52454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882752.52476: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882752.52484: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.52492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.52576: Set connection var ansible_shell_type to sh 27844 1726882752.52583: Set connection var ansible_connection to ssh 27844 1726882752.52592: Set connection var ansible_pipelining to False 27844 1726882752.52601: Set connection var ansible_timeout to 10 27844 1726882752.52615: Set connection var ansible_shell_executable to /bin/sh 27844 1726882752.52624: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882752.52650: variable 'ansible_shell_executable' from source: unknown 27844 1726882752.52656: variable 'ansible_connection' from source: unknown 27844 1726882752.52662: variable 'ansible_module_compression' from source: unknown 27844 1726882752.52675: variable 'ansible_shell_type' from source: unknown 27844 1726882752.52682: variable 'ansible_shell_executable' from source: unknown 27844 1726882752.52689: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.52696: variable 'ansible_pipelining' from source: unknown 27844 1726882752.52702: variable 'ansible_timeout' from source: unknown 27844 1726882752.52709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.52811: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882752.52835: variable 'omit' from source: magic vars 27844 1726882752.52843: starting attempt loop 27844 1726882752.52850: running the handler 27844 1726882752.52861: _low_level_execute_command(): starting 27844 1726882752.52875: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882752.53526: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.53541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.53557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.53581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.53628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.53640: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.53654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.53678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.53691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.53707: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.53719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.53733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.53750: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.53762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.53778: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.53790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.53873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.53896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.53916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.54044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.55641: stdout chunk (state=3): >>>/root <<< 27844 1726882752.55833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.55836: stdout chunk (state=3): >>><<< 27844 1726882752.55838: stderr chunk (state=3): >>><<< 27844 1726882752.55871: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.55875: _low_level_execute_command(): starting 27844 1726882752.55946: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603 `" && echo ansible-tmp-1726882752.5585272-28480-233883021057603="` echo /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603 `" ) && sleep 0' 27844 1726882752.56519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.56532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.56545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.56571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.56616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.56628: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.56641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.56657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.56674: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.56686: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.56697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.56714: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 27844 1726882752.56730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.56742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.56752: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.56769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.56848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.56874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.56890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.57018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.58902: stdout chunk (state=3): >>>ansible-tmp-1726882752.5585272-28480-233883021057603=/root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603 <<< 27844 1726882752.59012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.59089: stderr chunk (state=3): >>><<< 27844 1726882752.59092: stdout chunk (state=3): >>><<< 27844 1726882752.59175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882752.5585272-28480-233883021057603=/root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.59185: variable 'ansible_module_compression' from source: unknown 27844 1726882752.59359: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882752.59363: variable 'ansible_facts' from source: unknown 27844 1726882752.59370: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603/AnsiballZ_command.py 27844 1726882752.59428: Sending initial data 27844 1726882752.59431: Sent initial data (156 bytes) 27844 1726882752.60416: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.60429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.60448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.60475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.60515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.60528: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.60542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 
1726882752.60571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882752.60587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.60598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.60610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.60624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.60639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.60651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.60662: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.60687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.60767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.60797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.60813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.60941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.62668: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 27844 1726882752.62755: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882752.62851: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmprhyx7rdb /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603/AnsiballZ_command.py <<< 27844 1726882752.62944: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882752.64271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.64347: stderr chunk (state=3): >>><<< 27844 1726882752.64350: stdout chunk (state=3): >>><<< 27844 1726882752.64371: done transferring module to remote 27844 1726882752.64382: _low_level_execute_command(): starting 27844 1726882752.64385: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603/ /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603/AnsiballZ_command.py && sleep 0' 27844 1726882752.65093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882752.65116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.65770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.65773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.65776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.65778: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882752.65780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.65782: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 27844 1726882752.65784: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882752.65786: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882752.65787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.65789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.65791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.65793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882752.65794: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882752.65796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.65800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.65802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.65804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.65806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.67279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.67282: stdout chunk (state=3): >>><<< 27844 1726882752.67285: stderr chunk (state=3): >>><<< 27844 1726882752.67335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.67341: _low_level_execute_command(): starting 27844 1726882752.67344: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603/AnsiballZ_command.py && sleep 0' 27844 1726882752.67982: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.67989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.68027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.68032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.68047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.68053: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.68132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.68136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882752.68148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.68271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.81530: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-20 21:39:12.810316", "end": "2024-09-20 21:39:12.813754", "delta": "0:00:00.003438", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882752.82658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882752.82709: stderr chunk (state=3): >>><<< 27844 1726882752.82713: stdout chunk (state=3): >>><<< 27844 1726882752.82729: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-20 21:39:12.810316", "end": "2024-09-20 21:39:12.813754", "delta": "0:00:00.003438", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
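The two module executions traced above (`ip link add ethtest1 type veth peer name peerethtest1`, then `ip link set peerethtest1 up`), together with the evaluated conditional `type == 'veth' and state == 'present' and interface not in current_interfaces` and the per-item `variable 'item'` lookups, are consistent with a looped `command` task. The playbook itself is not part of this log, so the following task is a hedged reconstruction; the task name and exact loop items are assumptions:

```yaml
# Hypothetical reconstruction -- the playbook that produced this log is not shown.
# The log records a loop over raw "ip" commands, gated on the veth/present/absent
# conditional that appears in the trace.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

Each loop item becomes one `_execute_module(ansible.legacy.command, ...)` round trip in the log: a temp dir is created on the target, `AnsiballZ_command.py` is transferred over SFTP, executed with `/usr/bin/python3.9`, and the temp dir is removed, all over the existing SSH ControlMaster session (`auto-mux: Trying existing master`).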
27844 1726882752.82752: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882752.82757: _low_level_execute_command(): starting 27844 1726882752.82762: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882752.5585272-28480-233883021057603/ > /dev/null 2>&1 && sleep 0' 27844 1726882752.83179: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.83182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.83223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.83226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.83229: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.83278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.83290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.83394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.85193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.85230: stderr chunk (state=3): >>><<< 27844 1726882752.85234: stdout chunk (state=3): >>><<< 27844 1726882752.85245: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 27844 1726882752.85251: handler run complete 27844 1726882752.85268: Evaluated conditional (False): False 27844 1726882752.85279: attempt loop complete, returning result 27844 1726882752.85294: variable 'item' from source: unknown 27844 1726882752.85352: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest1", "up" ], "delta": "0:00:00.003438", "end": "2024-09-20 21:39:12.813754", "item": "ip link set peerethtest1 up", "rc": 0, "start": "2024-09-20 21:39:12.810316" } 27844 1726882752.85470: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.85474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.85477: variable 'omit' from source: magic vars 27844 1726882752.85577: variable 'ansible_distribution_major_version' from source: facts 27844 1726882752.85580: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882752.85703: variable 'type' from source: set_fact 27844 1726882752.85707: variable 'state' from source: include params 27844 1726882752.85709: variable 'interface' from source: set_fact 27844 1726882752.85712: variable 'current_interfaces' from source: set_fact 27844 1726882752.85719: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27844 1726882752.85722: variable 'omit' from source: magic vars 27844 1726882752.85733: variable 'omit' from source: magic vars 27844 1726882752.85760: variable 'item' from source: unknown 27844 1726882752.85812: variable 'item' from source: unknown 27844 1726882752.85819: variable 'omit' from source: magic vars 27844 1726882752.85835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882752.85842: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882752.85847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882752.85857: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882752.85859: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.85862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.85922: Set connection var ansible_shell_type to sh 27844 1726882752.85925: Set connection var ansible_connection to ssh 27844 1726882752.85928: Set connection var ansible_pipelining to False 27844 1726882752.85930: Set connection var ansible_timeout to 10 27844 1726882752.85932: Set connection var ansible_shell_executable to /bin/sh 27844 1726882752.85934: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882752.85950: variable 'ansible_shell_executable' from source: unknown 27844 1726882752.85953: variable 'ansible_connection' from source: unknown 27844 1726882752.85955: variable 'ansible_module_compression' from source: unknown 27844 1726882752.85957: variable 'ansible_shell_type' from source: unknown 27844 1726882752.85960: variable 'ansible_shell_executable' from source: unknown 27844 1726882752.85962: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882752.85969: variable 'ansible_pipelining' from source: unknown 27844 1726882752.85972: variable 'ansible_timeout' from source: unknown 27844 1726882752.85976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882752.86039: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882752.86045: variable 'omit' from source: magic vars 27844 1726882752.86048: starting attempt loop 27844 1726882752.86051: running the handler 27844 1726882752.86057: _low_level_execute_command(): starting 27844 1726882752.86060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882752.86460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.86481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.86493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.86504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.86548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.86560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.86659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.88221: stdout 
chunk (state=3): >>>/root <<< 27844 1726882752.88324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.88361: stderr chunk (state=3): >>><<< 27844 1726882752.88366: stdout chunk (state=3): >>><<< 27844 1726882752.88380: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.88388: _low_level_execute_command(): starting 27844 1726882752.88393: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179 `" && echo ansible-tmp-1726882752.8837864-28480-123948129147179="` echo /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179 `" ) && sleep 0' 27844 1726882752.88785: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.88798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.88814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.88834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.88878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.88893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.88987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.90829: stdout chunk (state=3): >>>ansible-tmp-1726882752.8837864-28480-123948129147179=/root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179 <<< 27844 1726882752.90972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.90987: stderr chunk (state=3): >>><<< 27844 1726882752.90990: stdout chunk (state=3): >>><<< 27844 1726882752.91001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882752.8837864-28480-123948129147179=/root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.91015: variable 'ansible_module_compression' from source: unknown 27844 1726882752.91042: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882752.91056: variable 'ansible_facts' from source: unknown 27844 1726882752.91106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179/AnsiballZ_command.py 27844 1726882752.91188: Sending initial data 27844 1726882752.91192: Sent initial data (156 bytes) 27844 1726882752.91801: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.91806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 
1726882752.91835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882752.91846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.91857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.91906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.91918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.92017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.93726: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 27844 1726882752.93735: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882752.93816: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 
261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882752.93906: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpytzjl8lf /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179/AnsiballZ_command.py <<< 27844 1726882752.93998: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882752.95000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.95083: stderr chunk (state=3): >>><<< 27844 1726882752.95087: stdout chunk (state=3): >>><<< 27844 1726882752.95100: done transferring module to remote 27844 1726882752.95106: _low_level_execute_command(): starting 27844 1726882752.95110: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179/ /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179/AnsiballZ_command.py && sleep 0' 27844 1726882752.95495: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.95505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.95530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.95541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.95589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.95601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.95709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882752.97435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882752.97479: stderr chunk (state=3): >>><<< 27844 1726882752.97482: stdout chunk (state=3): >>><<< 27844 1726882752.97493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882752.97496: _low_level_execute_command(): starting 27844 1726882752.97500: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179/AnsiballZ_command.py && sleep 0' 27844 1726882752.97895: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882752.97898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882752.97928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.97931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882752.97933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882752.97993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882752.97998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882752.98092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.11805: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-20 21:39:13.108801", "end": "2024-09-20 21:39:13.113816", "delta": "0:00:00.005015", "msg": "", "invocation": 
{"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882753.12751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882753.12809: stderr chunk (state=3): >>><<< 27844 1726882753.12813: stdout chunk (state=3): >>><<< 27844 1726882753.12827: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-20 21:39:13.108801", "end": "2024-09-20 21:39:13.113816", "delta": "0:00:00.005015", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882753.12851: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882753.12856: _low_level_execute_command(): starting 27844 1726882753.12860: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882752.8837864-28480-123948129147179/ > /dev/null 2>&1 && sleep 0' 27844 1726882753.13318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882753.13323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.13357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.13374: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.13422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.13444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.13535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.15329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.15372: stderr chunk (state=3): >>><<< 27844 1726882753.15377: stdout chunk (state=3): >>><<< 27844 1726882753.15390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.15395: handler run complete 27844 1726882753.15409: Evaluated conditional (False): False 27844 1726882753.15421: attempt loop complete, returning result 27844 1726882753.15436: variable 'item' from source: unknown 27844 1726882753.15498: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set ethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest1", "up" ], "delta": "0:00:00.005015", "end": "2024-09-20 21:39:13.113816", "item": "ip link set ethtest1 up", "rc": 0, "start": "2024-09-20 21:39:13.108801" } 27844 1726882753.15617: dumping result to json 27844 1726882753.15620: done dumping result, returning 27844 1726882753.15621: done running TaskExecutor() for managed_node1/TASK: Create veth interface ethtest1 [0e448fcc-3ce9-efa9-466a-000000000300] 27844 1726882753.15623: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000300 27844 1726882753.15673: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000300 27844 1726882753.15676: WORKER PROCESS EXITING 27844 1726882753.15736: no more pending results, returning what we have 27844 1726882753.15740: results queue empty 27844 1726882753.15741: checking for any_errors_fatal 27844 1726882753.15745: done checking for any_errors_fatal 27844 1726882753.15746: checking for max_fail_percentage 27844 1726882753.15747: done checking for max_fail_percentage 27844 1726882753.15748: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.15749: done checking to see if all hosts have failed 27844 1726882753.15750: getting the remaining hosts for this loop 27844 1726882753.15751: done getting the remaining hosts for this loop 
27844 1726882753.15754: getting the next task for host managed_node1 27844 1726882753.15759: done getting next task for host managed_node1 27844 1726882753.15761: ^ task is: TASK: Set up veth as managed by NetworkManager 27844 1726882753.15768: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882753.15772: getting variables 27844 1726882753.15773: in VariableManager get_vars() 27844 1726882753.15810: Calling all_inventory to load vars for managed_node1 27844 1726882753.15813: Calling groups_inventory to load vars for managed_node1 27844 1726882753.15815: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.15824: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.15826: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.15829: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.16134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.16247: done with get_vars() 27844 1726882753.16254: done getting variables 27844 1726882753.16298: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by 
NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:39:13 -0400 (0:00:01.070) 0:00:12.239 ****** 27844 1726882753.16319: entering _queue_task() for managed_node1/command 27844 1726882753.16496: worker is 1 (out of 1 available) 27844 1726882753.16508: exiting _queue_task() for managed_node1/command 27844 1726882753.16520: done queuing things up, now waiting for results queue to drain 27844 1726882753.16521: waiting for pending results... 27844 1726882753.16681: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 27844 1726882753.16747: in run() - task 0e448fcc-3ce9-efa9-466a-000000000301 27844 1726882753.16759: variable 'ansible_search_path' from source: unknown 27844 1726882753.16762: variable 'ansible_search_path' from source: unknown 27844 1726882753.16796: calling self._execute() 27844 1726882753.16861: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.16871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.16879: variable 'omit' from source: magic vars 27844 1726882753.17144: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.17154: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.17266: variable 'type' from source: set_fact 27844 1726882753.17272: variable 'state' from source: include params 27844 1726882753.17278: Evaluated conditional (type == 'veth' and state == 'present'): True 27844 1726882753.17284: variable 'omit' from source: magic vars 27844 1726882753.17312: variable 'omit' from source: magic vars 27844 1726882753.17379: variable 'interface' from source: set_fact 27844 1726882753.17391: variable 'omit' from source: magic vars 27844 1726882753.17427: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882753.17452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882753.17471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882753.17485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882753.17494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882753.17517: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882753.17525: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.17528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.17596: Set connection var ansible_shell_type to sh 27844 1726882753.17599: Set connection var ansible_connection to ssh 27844 1726882753.17602: Set connection var ansible_pipelining to False 27844 1726882753.17608: Set connection var ansible_timeout to 10 27844 1726882753.17614: Set connection var ansible_shell_executable to /bin/sh 27844 1726882753.17623: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882753.17644: variable 'ansible_shell_executable' from source: unknown 27844 1726882753.17647: variable 'ansible_connection' from source: unknown 27844 1726882753.17650: variable 'ansible_module_compression' from source: unknown 27844 1726882753.17652: variable 'ansible_shell_type' from source: unknown 27844 1726882753.17654: variable 'ansible_shell_executable' from source: unknown 27844 1726882753.17656: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.17659: variable 'ansible_pipelining' from source: unknown 27844 1726882753.17661: variable 'ansible_timeout' from 
source: unknown 27844 1726882753.17666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.17768: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882753.17779: variable 'omit' from source: magic vars 27844 1726882753.17783: starting attempt loop 27844 1726882753.17786: running the handler 27844 1726882753.17798: _low_level_execute_command(): starting 27844 1726882753.17805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882753.18327: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.18342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.18354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882753.18372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.18390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.18426: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.18438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.18543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.20103: stdout chunk (state=3): >>>/root <<< 27844 1726882753.20209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.20253: stderr chunk (state=3): >>><<< 27844 1726882753.20256: stdout chunk (state=3): >>><<< 27844 1726882753.20281: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.20291: _low_level_execute_command(): starting 27844 1726882753.20295: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871 `" && echo ansible-tmp-1726882753.202791-28530-8619525948871="` echo /root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871 `" ) && sleep 0' 27844 1726882753.20716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.20728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.20759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.20778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.20810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.20822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.20923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.22757: stdout chunk (state=3): >>>ansible-tmp-1726882753.202791-28530-8619525948871=/root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871 <<< 27844 1726882753.22872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.22913: 
stderr chunk (state=3): >>><<< 27844 1726882753.22916: stdout chunk (state=3): >>><<< 27844 1726882753.22931: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882753.202791-28530-8619525948871=/root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.22953: variable 'ansible_module_compression' from source: unknown 27844 1726882753.22999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882753.23031: variable 'ansible_facts' from source: unknown 27844 1726882753.23095: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871/AnsiballZ_command.py 27844 1726882753.23195: Sending initial data 27844 1726882753.23210: Sent initial data (153 bytes) 27844 1726882753.23825: 
stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.23837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.23860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882753.23878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.23921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.23932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.24029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.25732: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882753.25823: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882753.25916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp8fip89rs /root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871/AnsiballZ_command.py <<< 27844 1726882753.26003: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882753.27000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.27086: stderr chunk (state=3): >>><<< 27844 1726882753.27090: stdout chunk (state=3): >>><<< 27844 1726882753.27109: done transferring module to remote 27844 1726882753.27117: _low_level_execute_command(): starting 27844 1726882753.27121: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871/ /root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871/AnsiballZ_command.py && sleep 0' 27844 1726882753.27517: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.27539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.27554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
27844 1726882753.27568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.27608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.27619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.27716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.29424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.29462: stderr chunk (state=3): >>><<< 27844 1726882753.29476: stdout chunk (state=3): >>><<< 27844 1726882753.29488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.29492: _low_level_execute_command(): starting 27844 1726882753.29494: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871/AnsiballZ_command.py && sleep 0' 27844 1726882753.29896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882753.29917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.29929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882753.29939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.29985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.29997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.30100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.45237: stdout chunk (state=3): >>> <<< 27844 1726882753.45243: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": 
["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-20 21:39:13.429324", "end": "2024-09-20 21:39:13.450282", "delta": "0:00:00.020958", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882753.46485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882753.46541: stderr chunk (state=3): >>><<< 27844 1726882753.46544: stdout chunk (state=3): >>><<< 27844 1726882753.46561: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-20 21:39:13.429324", "end": "2024-09-20 21:39:13.450282", "delta": "0:00:00.020958", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882753.46594: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest1 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882753.46600: _low_level_execute_command(): starting 27844 1726882753.46605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882753.202791-28530-8619525948871/ > /dev/null 2>&1 && sleep 0' 27844 1726882753.47071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882753.47075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.47113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match not found <<< 27844 1726882753.47116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882753.47119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882753.47121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.47166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.47180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.47283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.49062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.49111: stderr chunk (state=3): >>><<< 27844 1726882753.49117: stdout chunk (state=3): >>><<< 27844 1726882753.49131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.49137: handler run complete 27844 1726882753.49154: Evaluated conditional (False): False 27844 1726882753.49161: attempt loop complete, returning result 27844 1726882753.49166: _execute() done 27844 1726882753.49171: dumping result to json 27844 1726882753.49176: done dumping result, returning 27844 1726882753.49185: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-efa9-466a-000000000301] 27844 1726882753.49187: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000301 27844 1726882753.49285: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000301 27844 1726882753.49288: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest1", "managed", "true" ], "delta": "0:00:00.020958", "end": "2024-09-20 21:39:13.450282", "rc": 0, "start": "2024-09-20 21:39:13.429324" } 27844 1726882753.49348: no more pending results, returning what we have 27844 1726882753.49352: results queue empty 27844 1726882753.49352: checking for any_errors_fatal 27844 1726882753.49369: done checking for any_errors_fatal 27844 1726882753.49370: checking for max_fail_percentage 27844 1726882753.49372: done checking for max_fail_percentage 27844 1726882753.49373: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.49374: done checking to see if all hosts have 
failed 27844 1726882753.49374: getting the remaining hosts for this loop 27844 1726882753.49376: done getting the remaining hosts for this loop 27844 1726882753.49379: getting the next task for host managed_node1 27844 1726882753.49384: done getting next task for host managed_node1 27844 1726882753.49387: ^ task is: TASK: Delete veth interface {{ interface }} 27844 1726882753.49389: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882753.49393: getting variables 27844 1726882753.49394: in VariableManager get_vars() 27844 1726882753.49432: Calling all_inventory to load vars for managed_node1 27844 1726882753.49434: Calling groups_inventory to load vars for managed_node1 27844 1726882753.49436: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.49446: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.49448: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.49451: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.49581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.49708: done with get_vars() 27844 1726882753.49718: done getting variables 27844 1726882753.49759: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882753.49848: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest1] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:39:13 -0400 (0:00:00.335) 0:00:12.575 ****** 27844 1726882753.49873: entering _queue_task() for managed_node1/command 27844 1726882753.50050: worker is 1 (out of 1 available) 27844 1726882753.50062: exiting _queue_task() for managed_node1/command 27844 1726882753.50077: done queuing things up, now waiting for results queue to drain 27844 1726882753.50079: waiting for pending results... 27844 1726882753.50238: running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest1 27844 1726882753.50304: in run() - task 0e448fcc-3ce9-efa9-466a-000000000302 27844 1726882753.50314: variable 'ansible_search_path' from source: unknown 27844 1726882753.50318: variable 'ansible_search_path' from source: unknown 27844 1726882753.50347: calling self._execute() 27844 1726882753.50417: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.50420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.50428: variable 'omit' from source: magic vars 27844 1726882753.50691: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.50703: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.50835: variable 'type' from source: set_fact 27844 1726882753.50838: variable 'state' from source: include params 27844 1726882753.50842: variable 'interface' from source: set_fact 27844 1726882753.50844: variable 'current_interfaces' from source: set_fact 27844 1726882753.50851: 
Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 27844 1726882753.50854: when evaluation is False, skipping this task 27844 1726882753.50858: _execute() done 27844 1726882753.50861: dumping result to json 27844 1726882753.50863: done dumping result, returning 27844 1726882753.50871: done running TaskExecutor() for managed_node1/TASK: Delete veth interface ethtest1 [0e448fcc-3ce9-efa9-466a-000000000302] 27844 1726882753.50877: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000302 27844 1726882753.50953: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000302 27844 1726882753.50957: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27844 1726882753.51011: no more pending results, returning what we have 27844 1726882753.51014: results queue empty 27844 1726882753.51015: checking for any_errors_fatal 27844 1726882753.51021: done checking for any_errors_fatal 27844 1726882753.51022: checking for max_fail_percentage 27844 1726882753.51023: done checking for max_fail_percentage 27844 1726882753.51024: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.51024: done checking to see if all hosts have failed 27844 1726882753.51025: getting the remaining hosts for this loop 27844 1726882753.51026: done getting the remaining hosts for this loop 27844 1726882753.51029: getting the next task for host managed_node1 27844 1726882753.51033: done getting next task for host managed_node1 27844 1726882753.51036: ^ task is: TASK: Create dummy interface {{ interface }} 27844 1726882753.51038: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882753.51041: getting variables 27844 1726882753.51043: in VariableManager get_vars() 27844 1726882753.51075: Calling all_inventory to load vars for managed_node1 27844 1726882753.51078: Calling groups_inventory to load vars for managed_node1 27844 1726882753.51079: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.51086: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.51087: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.51089: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.51195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.51349: done with get_vars() 27844 1726882753.51355: done getting variables 27844 1726882753.51397: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882753.51469: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest1] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:39:13 -0400 (0:00:00.016) 0:00:12.591 ****** 27844 1726882753.51490: entering _queue_task() for managed_node1/command 27844 1726882753.51646: worker is 1 
(out of 1 available) 27844 1726882753.51659: exiting _queue_task() for managed_node1/command 27844 1726882753.51675: done queuing things up, now waiting for results queue to drain 27844 1726882753.51677: waiting for pending results... 27844 1726882753.51817: running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest1 27844 1726882753.51880: in run() - task 0e448fcc-3ce9-efa9-466a-000000000303 27844 1726882753.51890: variable 'ansible_search_path' from source: unknown 27844 1726882753.51893: variable 'ansible_search_path' from source: unknown 27844 1726882753.51920: calling self._execute() 27844 1726882753.51988: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.51992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.52000: variable 'omit' from source: magic vars 27844 1726882753.52240: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.52250: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.52379: variable 'type' from source: set_fact 27844 1726882753.52388: variable 'state' from source: include params 27844 1726882753.52395: variable 'interface' from source: set_fact 27844 1726882753.52398: variable 'current_interfaces' from source: set_fact 27844 1726882753.52405: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 27844 1726882753.52408: when evaluation is False, skipping this task 27844 1726882753.52411: _execute() done 27844 1726882753.52413: dumping result to json 27844 1726882753.52415: done dumping result, returning 27844 1726882753.52422: done running TaskExecutor() for managed_node1/TASK: Create dummy interface ethtest1 [0e448fcc-3ce9-efa9-466a-000000000303] 27844 1726882753.52426: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000303 27844 1726882753.52505: done sending task result for task 
0e448fcc-3ce9-efa9-466a-000000000303 27844 1726882753.52508: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27844 1726882753.52584: no more pending results, returning what we have 27844 1726882753.52587: results queue empty 27844 1726882753.52587: checking for any_errors_fatal 27844 1726882753.52590: done checking for any_errors_fatal 27844 1726882753.52591: checking for max_fail_percentage 27844 1726882753.52592: done checking for max_fail_percentage 27844 1726882753.52593: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.52594: done checking to see if all hosts have failed 27844 1726882753.52594: getting the remaining hosts for this loop 27844 1726882753.52595: done getting the remaining hosts for this loop 27844 1726882753.52597: getting the next task for host managed_node1 27844 1726882753.52601: done getting next task for host managed_node1 27844 1726882753.52602: ^ task is: TASK: Delete dummy interface {{ interface }} 27844 1726882753.52605: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882753.52607: getting variables 27844 1726882753.52608: in VariableManager get_vars() 27844 1726882753.52633: Calling all_inventory to load vars for managed_node1 27844 1726882753.52635: Calling groups_inventory to load vars for managed_node1 27844 1726882753.52636: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.52643: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.52644: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.52646: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.52750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.52876: done with get_vars() 27844 1726882753.52882: done getting variables 27844 1726882753.52920: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882753.52993: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest1] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:39:13 -0400 (0:00:00.015) 0:00:12.606 ****** 27844 1726882753.53012: entering _queue_task() for managed_node1/command 27844 1726882753.53161: worker is 1 (out of 1 available) 27844 1726882753.53175: exiting _queue_task() for managed_node1/command 27844 1726882753.53188: done queuing things up, now waiting for results queue to drain 27844 1726882753.53190: waiting for pending results... 
27844 1726882753.53324: running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest1 27844 1726882753.53381: in run() - task 0e448fcc-3ce9-efa9-466a-000000000304 27844 1726882753.53390: variable 'ansible_search_path' from source: unknown 27844 1726882753.53394: variable 'ansible_search_path' from source: unknown 27844 1726882753.53420: calling self._execute() 27844 1726882753.53477: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.53486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.53494: variable 'omit' from source: magic vars 27844 1726882753.53740: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.53752: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.53884: variable 'type' from source: set_fact 27844 1726882753.53887: variable 'state' from source: include params 27844 1726882753.53890: variable 'interface' from source: set_fact 27844 1726882753.53894: variable 'current_interfaces' from source: set_fact 27844 1726882753.53900: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 27844 1726882753.53904: when evaluation is False, skipping this task 27844 1726882753.53909: _execute() done 27844 1726882753.53914: dumping result to json 27844 1726882753.53917: done dumping result, returning 27844 1726882753.53923: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface ethtest1 [0e448fcc-3ce9-efa9-466a-000000000304] 27844 1726882753.53928: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000304 27844 1726882753.54001: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000304 27844 1726882753.54004: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 27844 1726882753.54049: no more pending results, returning what we have 27844 1726882753.54052: results queue empty 27844 1726882753.54053: checking for any_errors_fatal 27844 1726882753.54057: done checking for any_errors_fatal 27844 1726882753.54058: checking for max_fail_percentage 27844 1726882753.54059: done checking for max_fail_percentage 27844 1726882753.54060: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.54061: done checking to see if all hosts have failed 27844 1726882753.54061: getting the remaining hosts for this loop 27844 1726882753.54062: done getting the remaining hosts for this loop 27844 1726882753.54068: getting the next task for host managed_node1 27844 1726882753.54072: done getting next task for host managed_node1 27844 1726882753.54074: ^ task is: TASK: Create tap interface {{ interface }} 27844 1726882753.54077: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882753.54080: getting variables 27844 1726882753.54081: in VariableManager get_vars() 27844 1726882753.54110: Calling all_inventory to load vars for managed_node1 27844 1726882753.54112: Calling groups_inventory to load vars for managed_node1 27844 1726882753.54113: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.54122: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.54124: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.54127: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.54265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.54378: done with get_vars() 27844 1726882753.54385: done getting variables 27844 1726882753.54421: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882753.54489: variable 'interface' from source: set_fact TASK [Create tap interface ethtest1] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:39:13 -0400 (0:00:00.014) 0:00:12.621 ****** 27844 1726882753.54507: entering _queue_task() for managed_node1/command 27844 1726882753.54649: worker is 1 (out of 1 available) 27844 1726882753.54660: exiting _queue_task() for managed_node1/command 27844 1726882753.54674: done queuing things up, now waiting for results queue to drain 27844 1726882753.54676: waiting for pending results... 
27844 1726882753.54814: running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest1 27844 1726882753.54873: in run() - task 0e448fcc-3ce9-efa9-466a-000000000305 27844 1726882753.54882: variable 'ansible_search_path' from source: unknown 27844 1726882753.54887: variable 'ansible_search_path' from source: unknown 27844 1726882753.54917: calling self._execute() 27844 1726882753.54975: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.54979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.54986: variable 'omit' from source: magic vars 27844 1726882753.55215: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.55223: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.55356: variable 'type' from source: set_fact 27844 1726882753.55359: variable 'state' from source: include params 27844 1726882753.55367: variable 'interface' from source: set_fact 27844 1726882753.55371: variable 'current_interfaces' from source: set_fact 27844 1726882753.55374: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 27844 1726882753.55377: when evaluation is False, skipping this task 27844 1726882753.55379: _execute() done 27844 1726882753.55382: dumping result to json 27844 1726882753.55386: done dumping result, returning 27844 1726882753.55392: done running TaskExecutor() for managed_node1/TASK: Create tap interface ethtest1 [0e448fcc-3ce9-efa9-466a-000000000305] 27844 1726882753.55397: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000305 27844 1726882753.55475: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000305 27844 1726882753.55478: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 27844 1726882753.55519: no more pending results, returning what we have 27844 1726882753.55522: results queue empty 27844 1726882753.55523: checking for any_errors_fatal 27844 1726882753.55526: done checking for any_errors_fatal 27844 1726882753.55527: checking for max_fail_percentage 27844 1726882753.55528: done checking for max_fail_percentage 27844 1726882753.55529: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.55530: done checking to see if all hosts have failed 27844 1726882753.55530: getting the remaining hosts for this loop 27844 1726882753.55531: done getting the remaining hosts for this loop 27844 1726882753.55534: getting the next task for host managed_node1 27844 1726882753.55538: done getting next task for host managed_node1 27844 1726882753.55540: ^ task is: TASK: Delete tap interface {{ interface }} 27844 1726882753.55543: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882753.55546: getting variables 27844 1726882753.55547: in VariableManager get_vars() 27844 1726882753.55578: Calling all_inventory to load vars for managed_node1 27844 1726882753.55581: Calling groups_inventory to load vars for managed_node1 27844 1726882753.55582: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.55590: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.55592: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.55594: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.55696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.55819: done with get_vars() 27844 1726882753.55826: done getting variables 27844 1726882753.55861: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882753.55934: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest1] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:39:13 -0400 (0:00:00.014) 0:00:12.636 ****** 27844 1726882753.55952: entering _queue_task() for managed_node1/command 27844 1726882753.56096: worker is 1 (out of 1 available) 27844 1726882753.56107: exiting _queue_task() for managed_node1/command 27844 1726882753.56118: done queuing things up, now waiting for results queue to drain 27844 1726882753.56119: waiting for pending results... 
27844 1726882753.56252: running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest1 27844 1726882753.56308: in run() - task 0e448fcc-3ce9-efa9-466a-000000000306 27844 1726882753.56315: variable 'ansible_search_path' from source: unknown 27844 1726882753.56317: variable 'ansible_search_path' from source: unknown 27844 1726882753.56343: calling self._execute() 27844 1726882753.56409: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.56418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.56426: variable 'omit' from source: magic vars 27844 1726882753.56646: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.56655: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.56783: variable 'type' from source: set_fact 27844 1726882753.56786: variable 'state' from source: include params 27844 1726882753.56791: variable 'interface' from source: set_fact 27844 1726882753.56794: variable 'current_interfaces' from source: set_fact 27844 1726882753.56804: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 27844 1726882753.56807: when evaluation is False, skipping this task 27844 1726882753.56810: _execute() done 27844 1726882753.56812: dumping result to json 27844 1726882753.56815: done dumping result, returning 27844 1726882753.56817: done running TaskExecutor() for managed_node1/TASK: Delete tap interface ethtest1 [0e448fcc-3ce9-efa9-466a-000000000306] 27844 1726882753.56821: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000306 27844 1726882753.56898: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000306 27844 1726882753.56901: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
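The veth/dummy/tap create and delete tasks skipped above all follow one gating pattern from manage_test_interface.yml: a `command` task guarded by a `when` conditional over `type`, `state`, and membership in `current_interfaces` (the log prints each conditional verbatim as the `false_condition`). A minimal sketch of that pattern, assuming `ip link` commands — the actual commands in manage_test_interface.yml are not shown in this log and may differ:

```yaml
# Hypothetical sketch of the skipped task pattern; ip link arguments are assumed.
- name: Create tap interface {{ interface }}
  command: ip link add {{ interface }} type tap
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip link del {{ interface }} type tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```

Here every conditional evaluates False (the test uses a veth in the 'present' state), so Ansible reports "when evaluation is False, skipping this task" and emits the `skipping:` result without ever dispatching the command to the host.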
27844 1726882753.56944: no more pending results, returning what we have 27844 1726882753.56947: results queue empty 27844 1726882753.56948: checking for any_errors_fatal 27844 1726882753.56953: done checking for any_errors_fatal 27844 1726882753.56953: checking for max_fail_percentage 27844 1726882753.56955: done checking for max_fail_percentage 27844 1726882753.56955: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.56956: done checking to see if all hosts have failed 27844 1726882753.56957: getting the remaining hosts for this loop 27844 1726882753.56958: done getting the remaining hosts for this loop 27844 1726882753.56961: getting the next task for host managed_node1 27844 1726882753.56970: done getting next task for host managed_node1 27844 1726882753.56973: ^ task is: TASK: Assert device is present 27844 1726882753.56974: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882753.56977: getting variables 27844 1726882753.56978: in VariableManager get_vars() 27844 1726882753.57002: Calling all_inventory to load vars for managed_node1 27844 1726882753.57003: Calling groups_inventory to load vars for managed_node1 27844 1726882753.57005: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.57016: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.57018: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.57020: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.57162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.57277: done with get_vars() 27844 1726882753.57283: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:32 Friday 20 September 2024 21:39:13 -0400 (0:00:00.013) 0:00:12.649 ****** 27844 1726882753.57337: entering _queue_task() for managed_node1/include_tasks 27844 1726882753.57488: worker is 1 (out of 1 available) 27844 1726882753.57500: exiting _queue_task() for managed_node1/include_tasks 27844 1726882753.57512: done queuing things up, now waiting for results queue to drain 27844 1726882753.57514: waiting for pending results... 
27844 1726882753.57643: running TaskExecutor() for managed_node1/TASK: Assert device is present 27844 1726882753.57703: in run() - task 0e448fcc-3ce9-efa9-466a-000000000012 27844 1726882753.57713: variable 'ansible_search_path' from source: unknown 27844 1726882753.57743: calling self._execute() 27844 1726882753.57811: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.57814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.57822: variable 'omit' from source: magic vars 27844 1726882753.58065: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.58080: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.58085: _execute() done 27844 1726882753.58088: dumping result to json 27844 1726882753.58091: done dumping result, returning 27844 1726882753.58097: done running TaskExecutor() for managed_node1/TASK: Assert device is present [0e448fcc-3ce9-efa9-466a-000000000012] 27844 1726882753.58107: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000012 27844 1726882753.58179: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000012 27844 1726882753.58182: WORKER PROCESS EXITING 27844 1726882753.58206: no more pending results, returning what we have 27844 1726882753.58217: in VariableManager get_vars() 27844 1726882753.58253: Calling all_inventory to load vars for managed_node1 27844 1726882753.58256: Calling groups_inventory to load vars for managed_node1 27844 1726882753.58258: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.58269: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.58271: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.58275: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.58383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 27844 1726882753.58495: done with get_vars() 27844 1726882753.58500: variable 'ansible_search_path' from source: unknown 27844 1726882753.58509: we have included files to process 27844 1726882753.58509: generating all_blocks data 27844 1726882753.58510: done generating all_blocks data 27844 1726882753.58513: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27844 1726882753.58514: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27844 1726882753.58516: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27844 1726882753.58582: in VariableManager get_vars() 27844 1726882753.58595: done with get_vars() 27844 1726882753.58664: done processing included file 27844 1726882753.58666: iterating over new_blocks loaded from include file 27844 1726882753.58668: in VariableManager get_vars() 27844 1726882753.58680: done with get_vars() 27844 1726882753.58681: filtering new block on tags 27844 1726882753.58692: done filtering new block on tags 27844 1726882753.58693: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 27844 1726882753.58696: extending task lists for all hosts with included blocks 27844 1726882753.59382: done extending task lists 27844 1726882753.59383: done processing included files 27844 1726882753.59383: results queue empty 27844 1726882753.59384: checking for any_errors_fatal 27844 1726882753.59385: done checking for any_errors_fatal 27844 1726882753.59386: checking for max_fail_percentage 27844 1726882753.59386: done checking for max_fail_percentage 27844 1726882753.59387: checking to see if all hosts have failed and the 
running result is not ok 27844 1726882753.59388: done checking to see if all hosts have failed 27844 1726882753.59388: getting the remaining hosts for this loop 27844 1726882753.59389: done getting the remaining hosts for this loop 27844 1726882753.59390: getting the next task for host managed_node1 27844 1726882753.59392: done getting next task for host managed_node1 27844 1726882753.59394: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27844 1726882753.59395: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882753.59397: getting variables 27844 1726882753.59397: in VariableManager get_vars() 27844 1726882753.59407: Calling all_inventory to load vars for managed_node1 27844 1726882753.59408: Calling groups_inventory to load vars for managed_node1 27844 1726882753.59410: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.59415: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.59416: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.59418: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.59497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.59604: done with get_vars() 27844 1726882753.59610: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:39:13 -0400 (0:00:00.023) 0:00:12.673 ****** 27844 1726882753.59657: entering _queue_task() for managed_node1/include_tasks 27844 1726882753.59801: worker is 1 (out of 1 available) 27844 1726882753.59813: exiting _queue_task() for managed_node1/include_tasks 27844 1726882753.59824: done queuing things up, now waiting for results queue to drain 27844 1726882753.59825: waiting for pending results... 
27844 1726882753.59959: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 27844 1726882753.60013: in run() - task 0e448fcc-3ce9-efa9-466a-0000000003eb 27844 1726882753.60022: variable 'ansible_search_path' from source: unknown 27844 1726882753.60025: variable 'ansible_search_path' from source: unknown 27844 1726882753.60051: calling self._execute() 27844 1726882753.60111: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.60115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.60123: variable 'omit' from source: magic vars 27844 1726882753.60378: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.60387: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.60393: _execute() done 27844 1726882753.60396: dumping result to json 27844 1726882753.60399: done dumping result, returning 27844 1726882753.60404: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-efa9-466a-0000000003eb] 27844 1726882753.60414: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003eb 27844 1726882753.60493: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003eb 27844 1726882753.60496: WORKER PROCESS EXITING 27844 1726882753.60526: no more pending results, returning what we have 27844 1726882753.60530: in VariableManager get_vars() 27844 1726882753.60567: Calling all_inventory to load vars for managed_node1 27844 1726882753.60570: Calling groups_inventory to load vars for managed_node1 27844 1726882753.60572: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.60579: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.60581: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.60582: Calling groups_plugins_play to load vars for managed_node1 27844 
1726882753.60710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.60822: done with get_vars() 27844 1726882753.60827: variable 'ansible_search_path' from source: unknown 27844 1726882753.60828: variable 'ansible_search_path' from source: unknown 27844 1726882753.60852: we have included files to process 27844 1726882753.60853: generating all_blocks data 27844 1726882753.60854: done generating all_blocks data 27844 1726882753.60854: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882753.60855: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882753.60856: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882753.60965: done processing included file 27844 1726882753.60967: iterating over new_blocks loaded from include file 27844 1726882753.60968: in VariableManager get_vars() 27844 1726882753.60980: done with get_vars() 27844 1726882753.60981: filtering new block on tags 27844 1726882753.60990: done filtering new block on tags 27844 1726882753.60991: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 27844 1726882753.60994: extending task lists for all hosts with included blocks 27844 1726882753.61047: done extending task lists 27844 1726882753.61047: done processing included files 27844 1726882753.61048: results queue empty 27844 1726882753.61048: checking for any_errors_fatal 27844 1726882753.61050: done checking for any_errors_fatal 27844 1726882753.61051: checking for max_fail_percentage 27844 1726882753.61051: done checking for 
max_fail_percentage 27844 1726882753.61052: checking to see if all hosts have failed and the running result is not ok 27844 1726882753.61052: done checking to see if all hosts have failed 27844 1726882753.61053: getting the remaining hosts for this loop 27844 1726882753.61054: done getting the remaining hosts for this loop 27844 1726882753.61056: getting the next task for host managed_node1 27844 1726882753.61059: done getting next task for host managed_node1 27844 1726882753.61060: ^ task is: TASK: Get stat for interface {{ interface }} 27844 1726882753.61062: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882753.61065: getting variables 27844 1726882753.61066: in VariableManager get_vars() 27844 1726882753.61076: Calling all_inventory to load vars for managed_node1 27844 1726882753.61077: Calling groups_inventory to load vars for managed_node1 27844 1726882753.61079: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882753.61082: Calling all_plugins_play to load vars for managed_node1 27844 1726882753.61083: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882753.61085: Calling groups_plugins_play to load vars for managed_node1 27844 1726882753.61159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882753.61285: done with get_vars() 27844 1726882753.61292: done getting variables 27844 1726882753.61384: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:39:13 -0400 (0:00:00.017) 0:00:12.690 ****** 27844 1726882753.61404: entering _queue_task() for managed_node1/stat 27844 1726882753.61547: worker is 1 (out of 1 available) 27844 1726882753.61560: exiting _queue_task() for managed_node1/stat 27844 1726882753.61572: done queuing things up, now waiting for results queue to drain 27844 1726882753.61574: waiting for pending results... 
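The task queued above ("Get stat for interface ethtest1") runs the `ansible.builtin.stat` module against `/sys/class/net/ethtest1`, as the `module_args` shown later in this log confirm (`get_attributes: false`, `get_checksum: false`, `follow: false`). A minimal Python sketch of the equivalent local check — the function name and the reduced set of returned fields are illustrative only, not the stat module's real implementation:

```python
import os


def interface_present(name: str) -> dict:
    """Rough local equivalent of the queued stat check.

    Network interfaces appear under /sys/class/net as symlinks into
    /sys/devices (hence islnk=true, lnk_source in the log's result).
    The real ansible.builtin.stat module returns many more fields;
    this sketch keeps only the ones relevant to a presence check.
    """
    path = os.path.join("/sys/class/net", name)
    try:
        st = os.lstat(path)  # lstat, since the sysfs entry is a symlink
    except FileNotFoundError:
        return {"exists": False, "path": path}
    return {
        "exists": True,
        "path": path,
        "islnk": os.path.islink(path),
        "nlink": st.st_nlink,
    }
```

On the managed node this returns `exists: true` with `islnk: true` for `ethtest1`, matching the module result that follows in the log.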
27844 1726882753.61703: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest1 27844 1726882753.61760: in run() - task 0e448fcc-3ce9-efa9-466a-000000000483 27844 1726882753.61773: variable 'ansible_search_path' from source: unknown 27844 1726882753.61777: variable 'ansible_search_path' from source: unknown 27844 1726882753.61803: calling self._execute() 27844 1726882753.61858: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.61862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.61873: variable 'omit' from source: magic vars 27844 1726882753.62094: variable 'ansible_distribution_major_version' from source: facts 27844 1726882753.62103: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882753.62108: variable 'omit' from source: magic vars 27844 1726882753.62136: variable 'omit' from source: magic vars 27844 1726882753.62202: variable 'interface' from source: set_fact 27844 1726882753.62214: variable 'omit' from source: magic vars 27844 1726882753.62244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882753.62275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882753.62289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882753.62301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882753.62309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882753.62330: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882753.62334: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.62337: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.62406: Set connection var ansible_shell_type to sh 27844 1726882753.62409: Set connection var ansible_connection to ssh 27844 1726882753.62413: Set connection var ansible_pipelining to False 27844 1726882753.62419: Set connection var ansible_timeout to 10 27844 1726882753.62424: Set connection var ansible_shell_executable to /bin/sh 27844 1726882753.62429: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882753.62448: variable 'ansible_shell_executable' from source: unknown 27844 1726882753.62451: variable 'ansible_connection' from source: unknown 27844 1726882753.62454: variable 'ansible_module_compression' from source: unknown 27844 1726882753.62457: variable 'ansible_shell_type' from source: unknown 27844 1726882753.62459: variable 'ansible_shell_executable' from source: unknown 27844 1726882753.62461: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882753.62468: variable 'ansible_pipelining' from source: unknown 27844 1726882753.62471: variable 'ansible_timeout' from source: unknown 27844 1726882753.62473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882753.62601: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882753.62609: variable 'omit' from source: magic vars 27844 1726882753.62614: starting attempt loop 27844 1726882753.62617: running the handler 27844 1726882753.62628: _low_level_execute_command(): starting 27844 1726882753.62635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882753.63141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.63161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882753.63180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.63223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.63251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.63351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.64977: stdout chunk (state=3): >>>/root <<< 27844 1726882753.65079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.65123: stderr chunk (state=3): >>><<< 27844 1726882753.65126: stdout chunk (state=3): >>><<< 27844 1726882753.65142: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.65152: _low_level_execute_command(): starting 27844 1726882753.65161: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931 `" && echo ansible-tmp-1726882753.65141-28552-238288123194931="` echo /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931 `" ) && sleep 0' 27844 1726882753.65578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882753.65592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.65615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882753.65630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.65674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.65686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.65788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.67641: stdout chunk (state=3): >>>ansible-tmp-1726882753.65141-28552-238288123194931=/root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931 <<< 27844 1726882753.67745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.67793: stderr chunk (state=3): >>><<< 27844 1726882753.67796: stdout chunk (state=3): >>><<< 27844 1726882753.67809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882753.65141-28552-238288123194931=/root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.67839: variable 'ansible_module_compression' from source: unknown 27844 1726882753.67891: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27844 1726882753.67916: variable 'ansible_facts' from source: unknown 27844 1726882753.67983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931/AnsiballZ_stat.py 27844 1726882753.68075: Sending initial data 27844 1726882753.68087: Sent initial data (151 bytes) 27844 1726882753.68718: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.68722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.68754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882753.68758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.68760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.68811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.68815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.68914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.70613: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882753.70704: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882753.70796: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp0ukqnv7k /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931/AnsiballZ_stat.py <<< 27844 1726882753.70888: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882753.71898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.71998: stderr chunk (state=3): >>><<< 27844 1726882753.72001: stdout chunk (state=3): >>><<< 27844 1726882753.72017: done transferring module to remote 27844 1726882753.72028: _low_level_execute_command(): starting 27844 
1726882753.72033: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931/ /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931/AnsiballZ_stat.py && sleep 0' 27844 1726882753.72478: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882753.72482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.72522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.72525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.72527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.72571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.72584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.72681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.74389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.74429: stderr chunk (state=3): >>><<< 27844 1726882753.74432: stdout chunk (state=3): >>><<< 27844 
1726882753.74445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882753.74456: _low_level_execute_command(): starting 27844 1726882753.74461: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931/AnsiballZ_stat.py && sleep 0' 27844 1726882753.74873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.74892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.74909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.74921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.74966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.74980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.75091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.88047: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29076, "dev": 21, "nlink": 1, "atime": 1726882752.439058, "mtime": 1726882752.439058, "ctime": 1726882752.439058, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27844 1726882753.88939: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882753.88987: stderr chunk (state=3): >>><<< 27844 1726882753.88991: stdout chunk (state=3): >>><<< 27844 1726882753.89005: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29076, "dev": 21, "nlink": 1, "atime": 1726882752.439058, "mtime": 1726882752.439058, "ctime": 1726882752.439058, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882753.89047: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882753.89054: _low_level_execute_command(): starting 27844 1726882753.89059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882753.65141-28552-238288123194931/ > /dev/null 2>&1 && sleep 0' 27844 1726882753.89487: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.89490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882753.89518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 
1726882753.89521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882753.89523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882753.89571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882753.89586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882753.89684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882753.91477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882753.91545: stderr chunk (state=3): >>><<< 27844 1726882753.91551: stdout chunk (state=3): >>><<< 27844 1726882753.91571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
27844 1726882753.91878: handler run complete
27844 1726882753.91881: attempt loop complete, returning result
27844 1726882753.91884: _execute() done
27844 1726882753.91886: dumping result to json
27844 1726882753.91888: done dumping result, returning
27844 1726882753.91890: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest1 [0e448fcc-3ce9-efa9-466a-000000000483]
27844 1726882753.91892: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000483
27844 1726882753.91980: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000483
27844 1726882753.91983: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "atime": 1726882752.439058,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1726882752.439058,
        "dev": 21,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 29076,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/ethtest1",
        "lnk_target": "../../devices/virtual/net/ethtest1",
        "mode": "0777",
        "mtime": 1726882752.439058,
        "nlink": 1,
        "path": "/sys/class/net/ethtest1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
27844 1726882753.92089: no more pending results, returning what we have
27844 1726882753.92093: results queue empty
27844 1726882753.92094: checking for any_errors_fatal
27844 1726882753.92096: done checking for any_errors_fatal
27844 1726882753.92096: checking for max_fail_percentage
27844 1726882753.92098: done checking for max_fail_percentage
27844 1726882753.92099: checking to see if all hosts have failed and the running result is not ok
27844 1726882753.92100: done checking to see if all hosts have failed
27844 1726882753.92101: getting the remaining hosts for this loop
27844 1726882753.92102: done getting the remaining hosts for this loop
27844 1726882753.92106: getting the next task for host managed_node1
27844 1726882753.92113: done getting next task for host managed_node1
27844 1726882753.92116: ^ task is: TASK: Assert that the interface is present - '{{ interface }}'
27844 1726882753.92119: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882753.92122: getting variables
27844 1726882753.92124: in VariableManager get_vars()
27844 1726882753.92170: Calling all_inventory to load vars for managed_node1
27844 1726882753.92173: Calling groups_inventory to load vars for managed_node1
27844 1726882753.92177: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882753.92190: Calling all_plugins_play to load vars for managed_node1
27844 1726882753.92193: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882753.92196: Calling groups_plugins_play to load vars for managed_node1
27844 1726882753.92503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882753.92733: done with get_vars()
27844 1726882753.92745: done getting variables
27844 1726882753.92808: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
27844 1726882753.92936: variable 'interface' from source: set_fact

TASK [Assert that the interface is present - 'ethtest1'] ***********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Friday 20 September 2024  21:39:13 -0400 (0:00:00.315)       0:00:13.006 ******
27844 1726882753.92969: entering _queue_task() for managed_node1/assert
27844 1726882753.93219: worker is 1 (out of 1 available)
27844 1726882753.93231: exiting _queue_task() for managed_node1/assert
27844 1726882753.93243: done queuing things up, now waiting for results queue to drain
27844 1726882753.93245: waiting for pending results...
27844 1726882753.93507: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest1'
27844 1726882753.93607: in run() - task 0e448fcc-3ce9-efa9-466a-0000000003ec
27844 1726882753.93626: variable 'ansible_search_path' from source: unknown
27844 1726882753.93633: variable 'ansible_search_path' from source: unknown
27844 1726882753.93678: calling self._execute()
27844 1726882753.93771: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882753.93783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882753.93803: variable 'omit' from source: magic vars
27844 1726882753.94153: variable 'ansible_distribution_major_version' from source: facts
27844 1726882753.94174: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882753.94185: variable 'omit' from source: magic vars
27844 1726882753.94227: variable 'omit' from source: magic vars
27844 1726882753.94329: variable 'interface' from source: set_fact
27844 1726882753.94361: variable 'omit' from source: magic vars
27844 1726882753.94406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
27844 1726882753.94524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
27844 1726882753.94549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
27844 1726882753.94580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882753.94596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882753.94628: variable 'inventory_hostname' from source: host vars for 'managed_node1'
27844 1726882753.94638: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882753.94646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882753.94753: Set connection var ansible_shell_type to sh
27844 1726882753.94760: Set connection var ansible_connection to ssh
27844 1726882753.94773: Set connection var ansible_pipelining to False
27844 1726882753.94789: Set connection var ansible_timeout to 10
27844 1726882753.94798: Set connection var ansible_shell_executable to /bin/sh
27844 1726882753.94806: Set connection var ansible_module_compression to ZIP_DEFLATED
27844 1726882753.94835: variable 'ansible_shell_executable' from source: unknown
27844 1726882753.94842: variable 'ansible_connection' from source: unknown
27844 1726882753.94848: variable 'ansible_module_compression' from source: unknown
27844 1726882753.94854: variable 'ansible_shell_type' from source: unknown
27844 1726882753.94860: variable 'ansible_shell_executable' from source: unknown
27844 1726882753.94869: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882753.94876: variable 'ansible_pipelining' from source: unknown
27844 1726882753.94882: variable 'ansible_timeout' from source: unknown
27844 1726882753.94894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882753.95027: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
27844 1726882753.95042: variable 'omit' from source: magic vars
27844 1726882753.95051: starting attempt loop
27844 1726882753.95057: running the handler
27844 1726882753.95195: variable 'interface_stat' from source: set_fact
27844 1726882753.95224: Evaluated conditional (interface_stat.stat.exists): True
27844 1726882753.95235: handler run complete
27844 1726882753.95255: attempt loop complete, returning result
27844 1726882753.95262: _execute() done
27844 1726882753.95272: dumping result to json
27844 1726882753.95280: done dumping result, returning
27844 1726882753.95291: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'ethtest1' [0e448fcc-3ce9-efa9-466a-0000000003ec]
27844 1726882753.95300: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003ec
27844 1726882753.95402: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000003ec
27844 1726882753.95409: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
27844 1726882753.95480: no more pending results, returning what we have
27844 1726882753.95484: results queue empty
27844 1726882753.95485: checking for any_errors_fatal
27844 1726882753.95494: done checking for any_errors_fatal
27844 1726882753.95495: checking for max_fail_percentage
27844 1726882753.95497: done checking for max_fail_percentage
27844 1726882753.95498: checking to see if all hosts have failed and the running result is not ok
27844 1726882753.95499: done checking to see if all hosts have failed
27844 1726882753.95500: getting the remaining hosts for this loop
27844 1726882753.95501: done getting the remaining hosts for this loop
27844 1726882753.95505: getting the next task for host managed_node1
27844 1726882753.95513: done getting next task for host managed_node1
27844 1726882753.95519: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
27844 1726882753.95522: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882753.95541: getting variables
27844 1726882753.95543: in VariableManager get_vars()
27844 1726882753.95585: Calling all_inventory to load vars for managed_node1
27844 1726882753.95587: Calling groups_inventory to load vars for managed_node1
27844 1726882753.95590: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882753.95600: Calling all_plugins_play to load vars for managed_node1
27844 1726882753.95603: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882753.95609: Calling groups_plugins_play to load vars for managed_node1
27844 1726882753.95803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882753.95931: done with get_vars()
27844 1726882753.95937: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024  21:39:13 -0400 (0:00:00.030)       0:00:13.036 ******
27844 1726882753.96004: entering _queue_task() for managed_node1/include_tasks
27844 1726882753.96171: worker is 1 (out of 1 available)
27844 1726882753.96182: exiting _queue_task() for managed_node1/include_tasks
27844 1726882753.96193: done queuing things up, now waiting for results queue to drain
27844 1726882753.96195: waiting for pending results...
27844 1726882753.96353: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
27844 1726882753.96426: in run() - task 0e448fcc-3ce9-efa9-466a-00000000001b
27844 1726882753.96437: variable 'ansible_search_path' from source: unknown
27844 1726882753.96440: variable 'ansible_search_path' from source: unknown
27844 1726882753.96470: calling self._execute()
27844 1726882753.96536: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882753.96540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882753.96548: variable 'omit' from source: magic vars
27844 1726882753.96791: variable 'ansible_distribution_major_version' from source: facts
27844 1726882753.96800: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882753.96805: _execute() done
27844 1726882753.96808: dumping result to json
27844 1726882753.96816: done dumping result, returning
27844 1726882753.96819: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-efa9-466a-00000000001b]
27844 1726882753.96824: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001b
27844 1726882753.96904: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001b
27844 1726882753.96907: WORKER PROCESS EXITING
27844 1726882753.96945: no more pending results, returning what we have
27844 1726882753.96950: in VariableManager get_vars()
27844 1726882753.96993: Calling all_inventory to load vars for managed_node1
27844 1726882753.96995: Calling groups_inventory to load vars for managed_node1
27844 1726882753.96998: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882753.97006: Calling all_plugins_play to load vars for managed_node1
27844 1726882753.97008: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882753.97011: Calling groups_plugins_play to load vars for managed_node1
27844 1726882753.97121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882753.97236: done with get_vars()
27844 1726882753.97241: variable 'ansible_search_path' from source: unknown
27844 1726882753.97242: variable 'ansible_search_path' from source: unknown
27844 1726882753.97284: we have included files to process
27844 1726882753.97285: generating all_blocks data
27844 1726882753.97287: done generating all_blocks data
27844 1726882753.97291: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
27844 1726882753.97292: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
27844 1726882753.97293: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
27844 1726882753.98025: done processing included file
27844 1726882753.98027: iterating over new_blocks loaded from include file
27844 1726882753.98028: in VariableManager get_vars()
27844 1726882753.98051: done with get_vars()
27844 1726882753.98053: filtering new block on tags
27844 1726882753.98074: done filtering new block on tags
27844 1726882753.98076: in VariableManager get_vars()
27844 1726882753.98099: done with get_vars()
27844 1726882753.98100: filtering new block on tags
27844 1726882753.98120: done filtering new block on tags
27844 1726882753.98122: in VariableManager get_vars()
27844 1726882753.98145: done with get_vars()
27844 1726882753.98147: filtering new block on tags
27844 1726882753.98169: done filtering new block on tags
27844 1726882753.98171: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
27844 1726882753.98176: extending task lists for all hosts with included blocks
27844 1726882753.98860: done extending task lists
27844 1726882753.98861: done processing included files
27844 1726882753.98862: results queue empty
27844 1726882753.98862: checking for any_errors_fatal
27844 1726882753.98866: done checking for any_errors_fatal
27844 1726882753.98867: checking for max_fail_percentage
27844 1726882753.98867: done checking for max_fail_percentage
27844 1726882753.98868: checking to see if all hosts have failed and the running result is not ok
27844 1726882753.98868: done checking to see if all hosts have failed
27844 1726882753.98869: getting the remaining hosts for this loop
27844 1726882753.98870: done getting the remaining hosts for this loop
27844 1726882753.98871: getting the next task for host managed_node1
27844 1726882753.98874: done getting next task for host managed_node1
27844 1726882753.98876: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
27844 1726882753.98878: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882753.98884: getting variables
27844 1726882753.98885: in VariableManager get_vars()
27844 1726882753.98897: Calling all_inventory to load vars for managed_node1
27844 1726882753.98898: Calling groups_inventory to load vars for managed_node1
27844 1726882753.98899: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882753.98902: Calling all_plugins_play to load vars for managed_node1
27844 1726882753.98904: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882753.98906: Calling groups_plugins_play to load vars for managed_node1
27844 1726882753.99001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882753.99115: done with get_vars()
27844 1726882753.99121: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024  21:39:13 -0400 (0:00:00.031)       0:00:13.068 ******
27844 1726882753.99169: entering _queue_task() for managed_node1/setup
27844 1726882753.99353: worker is 1 (out of 1 available)
27844 1726882753.99367: exiting _queue_task() for managed_node1/setup
27844 1726882753.99379: done queuing things up, now waiting for results queue to drain
27844 1726882753.99380: waiting for pending results...
27844 1726882753.99543: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
27844 1726882753.99633: in run() - task 0e448fcc-3ce9-efa9-466a-00000000049b
27844 1726882753.99644: variable 'ansible_search_path' from source: unknown
27844 1726882753.99651: variable 'ansible_search_path' from source: unknown
27844 1726882753.99687: calling self._execute()
27844 1726882753.99744: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882753.99747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882753.99756: variable 'omit' from source: magic vars
27844 1726882754.00018: variable 'ansible_distribution_major_version' from source: facts
27844 1726882754.00028: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882754.00169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
27844 1726882754.02244: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
27844 1726882754.02762: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
27844 1726882754.02791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
27844 1726882754.02815: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
27844 1726882754.02837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
27844 1726882754.02909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
27844 1726882754.02924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
27844 1726882754.02943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
27844 1726882754.02975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
27844 1726882754.02986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
27844 1726882754.03021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
27844 1726882754.03036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
27844 1726882754.03056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
27844 1726882754.03087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
27844 1726882754.03097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
27844 1726882754.03199: variable '__network_required_facts' from source: role '' defaults
27844 1726882754.03206: variable 'ansible_facts' from source: unknown
27844 1726882754.03261: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
27844 1726882754.03268: when evaluation is False, skipping this task
27844 1726882754.03271: _execute() done
27844 1726882754.03274: dumping result to json
27844 1726882754.03276: done dumping result, returning
27844 1726882754.03280: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-efa9-466a-00000000049b]
27844 1726882754.03282: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000049b
27844 1726882754.03359: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000049b
27844 1726882754.03362: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
27844 1726882754.03409: no more pending results, returning what we have
27844 1726882754.03413: results queue empty
27844 1726882754.03414: checking for any_errors_fatal
27844 1726882754.03415: done checking for any_errors_fatal
27844 1726882754.03416: checking for max_fail_percentage
27844 1726882754.03418: done checking for max_fail_percentage
27844 1726882754.03418: checking to see if all hosts have failed and the running result is not ok
27844 1726882754.03419: done checking to see if all hosts have failed
27844 1726882754.03420: getting the remaining hosts for this loop
27844 1726882754.03421: done getting the remaining hosts for this loop
27844 1726882754.03425: getting the next task for host managed_node1
27844 1726882754.03432: done getting next task for host managed_node1
27844 1726882754.03436: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
27844 1726882754.03440: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882754.03452: getting variables
27844 1726882754.03453: in VariableManager get_vars()
27844 1726882754.03501: Calling all_inventory to load vars for managed_node1
27844 1726882754.03504: Calling groups_inventory to load vars for managed_node1
27844 1726882754.03506: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882754.03515: Calling all_plugins_play to load vars for managed_node1
27844 1726882754.03517: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882754.03520: Calling groups_plugins_play to load vars for managed_node1
27844 1726882754.03642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882754.03826: done with get_vars()
27844 1726882754.03837: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024  21:39:14 -0400 (0:00:00.047)       0:00:13.115 ******
27844 1726882754.03936: entering _queue_task() for managed_node1/stat
27844 1726882754.04175: worker is 1 (out of 1 available)
27844 1726882754.04187: exiting _queue_task() for managed_node1/stat
27844 1726882754.04199: done queuing things up, now waiting for results queue to drain
27844 1726882754.04201: waiting for pending results...
27844 1726882754.04474: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
27844 1726882754.04744: in run() - task 0e448fcc-3ce9-efa9-466a-00000000049d
27844 1726882754.04771: variable 'ansible_search_path' from source: unknown
27844 1726882754.04815: variable 'ansible_search_path' from source: unknown
27844 1726882754.04857: calling self._execute()
27844 1726882754.05815: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882754.05818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882754.05832: variable 'omit' from source: magic vars
27844 1726882754.06138: variable 'ansible_distribution_major_version' from source: facts
27844 1726882754.06170: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882754.06343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
27844 1726882754.06717: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
27844 1726882754.06772: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
27844 1726882754.06810: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
27844 1726882754.06856: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
27844 1726882754.06950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
27844 1726882754.06986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
27844 1726882754.07017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
27844 1726882754.07063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
27844 1726882754.07165: variable '__network_is_ostree' from source: set_fact
27844 1726882754.07182: Evaluated conditional (not __network_is_ostree is defined): False
27844 1726882754.07191: when evaluation is False, skipping this task
27844 1726882754.07198: _execute() done
27844 1726882754.07206: dumping result to json
27844 1726882754.07214: done dumping result, returning
27844 1726882754.07225: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-efa9-466a-00000000049d]
27844 1726882754.07235: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000049d
27844 1726882754.07345: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000049d
27844 1726882754.07355: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
27844 1726882754.07409: no more pending results, returning what we have
27844 1726882754.07413: results queue empty
27844 1726882754.07414: checking for any_errors_fatal
27844 1726882754.07420: done checking for any_errors_fatal
27844 1726882754.07420: checking for max_fail_percentage
27844 1726882754.07422: done checking for max_fail_percentage
27844 1726882754.07423: checking to see if all hosts have failed and the running result is not ok
27844 1726882754.07424: done checking to see if all hosts have failed
27844 1726882754.07424: getting the remaining hosts for this loop
27844 1726882754.07426: done getting the remaining hosts for this loop
27844 1726882754.07429: getting the next task for host managed_node1
27844 1726882754.07434: done getting next task for host managed_node1
27844 1726882754.07437: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
27844 1726882754.07441: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882754.07460: getting variables
27844 1726882754.07463: in VariableManager get_vars()
27844 1726882754.07510: Calling all_inventory to load vars for managed_node1
27844 1726882754.07512: Calling groups_inventory to load vars for managed_node1
27844 1726882754.07515: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882754.07525: Calling all_plugins_play to load vars for managed_node1
27844 1726882754.07527: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882754.07529: Calling groups_plugins_play to load vars for managed_node1
27844 1726882754.07748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882754.07961: done with get_vars()
27844 1726882754.07975: done getting variables
27844 1726882754.08027: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024  21:39:14 -0400 (0:00:00.041)       0:00:13.157 ******
27844 1726882754.08060: entering _queue_task() for managed_node1/set_fact
27844 1726882754.08289: worker is 1 (out of 1 available)
27844 1726882754.08302: exiting _queue_task() for managed_node1/set_fact
27844 1726882754.08314: done queuing things up, now waiting for results queue to drain
27844 1726882754.08316: waiting for pending results...
27844 1726882754.08580: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
27844 1726882754.08720: in run() - task 0e448fcc-3ce9-efa9-466a-00000000049e
27844 1726882754.08739: variable 'ansible_search_path' from source: unknown
27844 1726882754.08746: variable 'ansible_search_path' from source: unknown
27844 1726882754.08791: calling self._execute()
27844 1726882754.08874: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882754.08884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882754.08897: variable 'omit' from source: magic vars
27844 1726882754.09462: variable 'ansible_distribution_major_version' from source: facts
27844 1726882754.09485: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882754.09640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
27844 1726882754.09914: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
27844 1726882754.09959: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
27844 1726882754.10005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
27844 1726882754.10043: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
27844 1726882754.10131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
27844 1726882754.10159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
27844 1726882754.10194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
27844 1726882754.10229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
27844 1726882754.10318: variable '__network_is_ostree' from source: set_fact
27844 1726882754.10333: Evaluated conditional (not __network_is_ostree is defined): False
27844 1726882754.10340: when evaluation is False, skipping this task
27844 1726882754.10346: _execute() done
27844 1726882754.10352: dumping result to json
27844 1726882754.10358: done dumping result, returning
27844 1726882754.10373: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-efa9-466a-00000000049e]
27844 1726882754.10385: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000049e
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
27844 1726882754.10517: no more pending results, returning what we have
27844 1726882754.10522: results queue empty
27844 1726882754.10524: checking for any_errors_fatal
27844 1726882754.10528: done checking for any_errors_fatal
27844 1726882754.10529: checking for max_fail_percentage
27844 1726882754.10531: done checking for max_fail_percentage
27844 1726882754.10532: checking to see if all hosts have failed and the running result is not ok
27844 1726882754.10532: done checking to see if all hosts have failed
27844 1726882754.10533: getting the remaining hosts for this loop
27844 1726882754.10535: done getting the remaining hosts for this loop
27844 1726882754.10538: getting the next task for host managed_node1
27844 1726882754.10544: done getting next task for host managed_node1
27844
1726882754.10548: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27844 1726882754.10552: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882754.10562: getting variables 27844 1726882754.10568: in VariableManager get_vars() 27844 1726882754.10605: Calling all_inventory to load vars for managed_node1 27844 1726882754.10607: Calling groups_inventory to load vars for managed_node1 27844 1726882754.10609: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882754.10619: Calling all_plugins_play to load vars for managed_node1 27844 1726882754.10622: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882754.10625: Calling groups_plugins_play to load vars for managed_node1 27844 1726882754.10800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882754.11001: done with get_vars() 27844 1726882754.11011: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
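The next task queued, "Check which services are running", drives the `service_facts` module whose ANSIBALLZ transfer and JSON result appear later in this log. A minimal sketch of such a task and of consuming its result (illustrative only; the role's actual task file is not shown here):

```yaml
# Hypothetical sketch -- service_facts takes no options and populates
# ansible_facts.services, the dictionary of per-service state/status
# entries visible in the module's stdout further down this log.
- name: Check which services are running
  service_facts:

# Illustrative consumer task (not part of the role):
- name: Report NetworkManager state
  debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
```

The role typically uses this fact dictionary to decide which network provider (NetworkManager vs. initscripts) is available on the managed host.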
Friday 20 September 2024 21:39:14 -0400 (0:00:00.030) 0:00:13.187 ****** 27844 1726882754.11126: entering _queue_task() for managed_node1/service_facts 27844 1726882754.11129: Creating lock for service_facts 27844 1726882754.11174: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000049e 27844 1726882754.11184: WORKER PROCESS EXITING 27844 1726882754.11568: worker is 1 (out of 1 available) 27844 1726882754.11579: exiting _queue_task() for managed_node1/service_facts 27844 1726882754.11590: done queuing things up, now waiting for results queue to drain 27844 1726882754.11591: waiting for pending results... 27844 1726882754.11849: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 27844 1726882754.11985: in run() - task 0e448fcc-3ce9-efa9-466a-0000000004a0 27844 1726882754.12003: variable 'ansible_search_path' from source: unknown 27844 1726882754.12010: variable 'ansible_search_path' from source: unknown 27844 1726882754.12052: calling self._execute() 27844 1726882754.12135: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882754.12149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882754.12163: variable 'omit' from source: magic vars 27844 1726882754.12626: variable 'ansible_distribution_major_version' from source: facts 27844 1726882754.12642: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882754.12652: variable 'omit' from source: magic vars 27844 1726882754.12759: variable 'omit' from source: magic vars 27844 1726882754.12799: variable 'omit' from source: magic vars 27844 1726882754.12859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882754.12898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882754.12921: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882754.12956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882754.12975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882754.13004: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882754.13014: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882754.13021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882754.13123: Set connection var ansible_shell_type to sh 27844 1726882754.13131: Set connection var ansible_connection to ssh 27844 1726882754.13140: Set connection var ansible_pipelining to False 27844 1726882754.13148: Set connection var ansible_timeout to 10 27844 1726882754.13160: Set connection var ansible_shell_executable to /bin/sh 27844 1726882754.13174: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882754.13202: variable 'ansible_shell_executable' from source: unknown 27844 1726882754.13209: variable 'ansible_connection' from source: unknown 27844 1726882754.13215: variable 'ansible_module_compression' from source: unknown 27844 1726882754.13220: variable 'ansible_shell_type' from source: unknown 27844 1726882754.13226: variable 'ansible_shell_executable' from source: unknown 27844 1726882754.13231: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882754.13238: variable 'ansible_pipelining' from source: unknown 27844 1726882754.13244: variable 'ansible_timeout' from source: unknown 27844 1726882754.13251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882754.13435: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882754.13450: variable 'omit' from source: magic vars 27844 1726882754.13458: starting attempt loop 27844 1726882754.13468: running the handler 27844 1726882754.13489: _low_level_execute_command(): starting 27844 1726882754.13500: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882754.14207: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882754.14222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.14238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.14256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.14306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882754.14319: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882754.14333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.14357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882754.14375: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882754.14387: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882754.14400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.14415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.14432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.14445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 27844 1726882754.14457: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882754.14481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.14553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882754.14580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882754.14594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882754.14721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882754.16397: stdout chunk (state=3): >>>/root <<< 27844 1726882754.16575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882754.16578: stdout chunk (state=3): >>><<< 27844 1726882754.16580: stderr chunk (state=3): >>><<< 27844 1726882754.16672: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882754.16676: _low_level_execute_command(): starting 27844 1726882754.16679: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162 `" && echo ansible-tmp-1726882754.1659863-28574-157035075948162="` echo /root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162 `" ) && sleep 0' 27844 1726882754.17257: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882754.17277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.17294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.17313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.17362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882754.17380: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882754.17395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.17412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882754.17424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882754.17434: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882754.17447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.17462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.17491: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.17504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882754.17516: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882754.17531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.17629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882754.17650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882754.17679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882754.17825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882754.19721: stdout chunk (state=3): >>>ansible-tmp-1726882754.1659863-28574-157035075948162=/root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162 <<< 27844 1726882754.19845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882754.19878: stderr chunk (state=3): >>><<< 27844 1726882754.19881: stdout chunk (state=3): >>><<< 27844 1726882754.19895: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882754.1659863-28574-157035075948162=/root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882754.19928: variable 'ansible_module_compression' from source: unknown 27844 1726882754.19960: ANSIBALLZ: Using lock for service_facts 27844 1726882754.19966: ANSIBALLZ: Acquiring lock 27844 1726882754.19968: ANSIBALLZ: Lock acquired: 139916602719456 27844 1726882754.19976: ANSIBALLZ: Creating module 27844 1726882754.31104: ANSIBALLZ: Writing module into payload 27844 1726882754.31223: ANSIBALLZ: Writing module 27844 1726882754.31246: ANSIBALLZ: Renaming module 27844 1726882754.31249: ANSIBALLZ: Done creating module 27844 1726882754.31271: variable 'ansible_facts' from source: unknown 27844 1726882754.31339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162/AnsiballZ_service_facts.py 27844 1726882754.31585: Sending initial data 27844 1726882754.31589: Sent initial data (162 bytes) 27844 1726882754.32502: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.32506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.32660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.32665: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.32683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.32690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.32883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882754.32897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882754.32902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882754.33031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882754.34954: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882754.34982: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882754.35082: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpkirgtqfx 
/root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162/AnsiballZ_service_facts.py <<< 27844 1726882754.35173: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882754.36851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882754.36871: stderr chunk (state=3): >>><<< 27844 1726882754.36874: stdout chunk (state=3): >>><<< 27844 1726882754.36888: done transferring module to remote 27844 1726882754.36903: _low_level_execute_command(): starting 27844 1726882754.36906: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162/ /root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162/AnsiballZ_service_facts.py && sleep 0' 27844 1726882754.37553: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882754.37576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.37580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.37588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.37637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.37640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.37642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.37704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882754.37711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882754.37819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882754.39600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882754.39652: stderr chunk (state=3): >>><<< 27844 1726882754.39655: stdout chunk (state=3): >>><<< 27844 1726882754.39663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882754.39670: _low_level_execute_command(): starting 27844 1726882754.39672: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162/AnsiballZ_service_facts.py && sleep 0' 27844 1726882754.40219: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882754.40233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.40238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.40250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.40283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882754.40290: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882754.40309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.40312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882754.40318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882754.40325: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882754.40332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882754.40341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882754.40354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882754.40359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882754.40370: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882754.40379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882754.40447: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882754.40473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882754.40477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882754.40596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882755.75511: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 27844 1726882755.75569: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": 
"sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27844 1726882755.76932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882755.76936: stdout chunk (state=3): >>><<< 27844 1726882755.76943: stderr chunk (state=3): >>><<< 27844 1726882755.76979: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": 
{"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": 
"systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": 
{"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882755.77950: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882755.77974: _low_level_execute_command(): starting 27844 1726882755.77989: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882754.1659863-28574-157035075948162/ > /dev/null 2>&1 && sleep 0' 27844 1726882755.78640: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882755.78653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882755.78671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882755.78690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882755.78733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882755.78745: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882755.78758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882755.78778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882755.78790: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 27844 1726882755.78802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882755.78813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882755.78824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882755.78843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882755.78855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882755.78868: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882755.78884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882755.78968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882755.78972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882755.79075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882755.80975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882755.80995: stderr chunk (state=3): >>><<< 27844 1726882755.80999: stdout chunk (state=3): >>><<< 27844 1726882755.81270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882755.81273: handler run complete 27844 1726882755.81276: variable 'ansible_facts' from source: unknown 27844 1726882755.81353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882755.81819: variable 'ansible_facts' from source: unknown 27844 1726882755.81899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882755.82016: attempt loop complete, returning result 27844 1726882755.82019: _execute() done 27844 1726882755.82024: dumping result to json 27844 1726882755.82054: done dumping result, returning 27844 1726882755.82064: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-efa9-466a-0000000004a0] 27844 1726882755.82071: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000004a0 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882755.82896: no more pending results, returning what we have 27844 1726882755.82899: results queue empty 27844 1726882755.82900: checking for any_errors_fatal 27844 1726882755.82903: done checking for any_errors_fatal 27844 1726882755.82904: checking for max_fail_percentage 27844 1726882755.82905: done checking for max_fail_percentage 27844 
1726882755.82906: checking to see if all hosts have failed and the running result is not ok 27844 1726882755.82907: done checking to see if all hosts have failed 27844 1726882755.82908: getting the remaining hosts for this loop 27844 1726882755.82909: done getting the remaining hosts for this loop 27844 1726882755.82914: getting the next task for host managed_node1 27844 1726882755.82918: done getting next task for host managed_node1 27844 1726882755.82921: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27844 1726882755.82925: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882755.82933: getting variables 27844 1726882755.82935: in VariableManager get_vars() 27844 1726882755.82969: Calling all_inventory to load vars for managed_node1 27844 1726882755.82973: Calling groups_inventory to load vars for managed_node1 27844 1726882755.82975: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882755.82983: Calling all_plugins_play to load vars for managed_node1 27844 1726882755.82986: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882755.82989: Calling groups_plugins_play to load vars for managed_node1 27844 1726882755.83326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882755.83822: done with get_vars() 27844 1726882755.83835: done getting variables 27844 1726882755.83870: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000004a0 27844 1726882755.83873: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:15 -0400 (0:00:01.728) 0:00:14.916 ****** 27844 1726882755.83947: entering _queue_task() for managed_node1/package_facts 27844 1726882755.83949: Creating lock for package_facts 27844 1726882755.84218: worker is 1 (out of 1 available) 27844 1726882755.84230: exiting _queue_task() for managed_node1/package_facts 27844 1726882755.84248: done queuing things up, now waiting for results queue to drain 27844 1726882755.84250: waiting for pending results... 
27844 1726882755.84542: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 27844 1726882755.84670: in run() - task 0e448fcc-3ce9-efa9-466a-0000000004a1 27844 1726882755.84697: variable 'ansible_search_path' from source: unknown 27844 1726882755.84703: variable 'ansible_search_path' from source: unknown 27844 1726882755.84734: calling self._execute() 27844 1726882755.84805: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882755.84814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882755.84822: variable 'omit' from source: magic vars 27844 1726882755.85121: variable 'ansible_distribution_major_version' from source: facts 27844 1726882755.85138: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882755.85144: variable 'omit' from source: magic vars 27844 1726882755.85192: variable 'omit' from source: magic vars 27844 1726882755.85213: variable 'omit' from source: magic vars 27844 1726882755.85251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882755.85280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882755.85295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882755.85307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882755.85316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882755.85345: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882755.85349: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882755.85352: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 27844 1726882755.85421: Set connection var ansible_shell_type to sh 27844 1726882755.85424: Set connection var ansible_connection to ssh 27844 1726882755.85427: Set connection var ansible_pipelining to False 27844 1726882755.85433: Set connection var ansible_timeout to 10 27844 1726882755.85438: Set connection var ansible_shell_executable to /bin/sh 27844 1726882755.85443: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882755.85471: variable 'ansible_shell_executable' from source: unknown 27844 1726882755.85477: variable 'ansible_connection' from source: unknown 27844 1726882755.85480: variable 'ansible_module_compression' from source: unknown 27844 1726882755.85483: variable 'ansible_shell_type' from source: unknown 27844 1726882755.85485: variable 'ansible_shell_executable' from source: unknown 27844 1726882755.85487: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882755.85491: variable 'ansible_pipelining' from source: unknown 27844 1726882755.85493: variable 'ansible_timeout' from source: unknown 27844 1726882755.85498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882755.85640: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882755.85649: variable 'omit' from source: magic vars 27844 1726882755.85652: starting attempt loop 27844 1726882755.85655: running the handler 27844 1726882755.85677: _low_level_execute_command(): starting 27844 1726882755.85685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882755.86163: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 27844 1726882755.86181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882755.86194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882755.86208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882755.86226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882755.86262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882755.86278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882755.86294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882755.86390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882755.88025: stdout chunk (state=3): >>>/root <<< 27844 1726882755.88177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882755.88213: stderr chunk (state=3): >>><<< 27844 1726882755.88223: stdout chunk (state=3): >>><<< 27844 1726882755.88248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882755.88280: _low_level_execute_command(): starting 27844 1726882755.88294: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306 `" && echo ansible-tmp-1726882755.8825438-28654-80852991825306="` echo /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306 `" ) && sleep 0' 27844 1726882755.88904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882755.88917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882755.88939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882755.88957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882755.88999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882755.89019: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882755.89046: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882755.89069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882755.89072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882755.89128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882755.89135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882755.89256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882755.91116: stdout chunk (state=3): >>>ansible-tmp-1726882755.8825438-28654-80852991825306=/root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306 <<< 27844 1726882755.91227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882755.91284: stderr chunk (state=3): >>><<< 27844 1726882755.91287: stdout chunk (state=3): >>><<< 27844 1726882755.91374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882755.8825438-28654-80852991825306=/root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882755.91378: variable 'ansible_module_compression' from source: unknown 27844 1726882755.91471: ANSIBALLZ: Using lock for package_facts 27844 1726882755.91476: ANSIBALLZ: Acquiring lock 27844 1726882755.91479: ANSIBALLZ: Lock acquired: 139916600843280 27844 1726882755.91481: ANSIBALLZ: Creating module 27844 1726882756.17718: ANSIBALLZ: Writing module into payload 27844 1726882756.17885: ANSIBALLZ: Writing module 27844 1726882756.17914: ANSIBALLZ: Renaming module 27844 1726882756.17919: ANSIBALLZ: Done creating module 27844 1726882756.17956: variable 'ansible_facts' from source: unknown 27844 1726882756.18140: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306/AnsiballZ_package_facts.py 27844 1726882756.18298: Sending initial data 27844 1726882756.18301: Sent initial data (161 bytes) 27844 1726882756.19284: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882756.19294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882756.19305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882756.19319: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882756.19357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882756.19363: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882756.19379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.19392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882756.19400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882756.19407: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882756.19414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882756.19424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882756.19435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882756.19443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882756.19449: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882756.19458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.19535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882756.19555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882756.19568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882756.19702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882756.21533: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882756.21639: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882756.21732: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpqcsmefso /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306/AnsiballZ_package_facts.py <<< 27844 1726882756.21825: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882756.23921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882756.24082: stderr chunk (state=3): >>><<< 27844 1726882756.24085: stdout chunk (state=3): >>><<< 27844 1726882756.24088: done transferring module to remote 27844 1726882756.24090: _low_level_execute_command(): starting 27844 1726882756.24094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306/ /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306/AnsiballZ_package_facts.py && sleep 0' 27844 1726882756.24582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882756.24588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882756.24628: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.24635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882756.24652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882756.24657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.24730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882756.24733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882756.24748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882756.24863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882756.26599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882756.26640: stderr chunk (state=3): >>><<< 27844 1726882756.26643: stdout chunk (state=3): >>><<< 27844 1726882756.26654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882756.26657: _low_level_execute_command(): starting 27844 1726882756.26661: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306/AnsiballZ_package_facts.py && sleep 0' 27844 1726882756.27059: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882756.27069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882756.27109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.27112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882756.27114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.27179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882756.27184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882756.27199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882756.27312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882756.72916: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": 
"6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": 
"26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 27844 1726882756.72954: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": 
"libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 27844 1726882756.72972: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", 
"release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 27844 1726882756.72992: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": 
[{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut":
[{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name":
"python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch",
"source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": 
"openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version":
"3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": 
"20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version":
"0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": 
[{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version":
"1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": 
"10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 27844 1726882756.73127: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", 
"release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 27844 1726882756.73150: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", 
"release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 27844 1726882756.73154: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": 
"19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27844 1726882756.74654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882756.74717: stderr chunk (state=3): >>><<< 27844 1726882756.74720: stdout chunk (state=3): >>><<< 27844 1726882756.74757: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
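The `package_facts` module result captured above (later censored in the play output because `no_log: true` was set on the task) illustrates the shape of the data the module returns: a mapping from package name to a list of entries, each carrying `name`, `version`, `release`, `epoch`, `arch`, and `source` keys, with `epoch` either an integer or null. A minimal sketch of querying such a structure follows; the helper names are illustrative only and are not part of `fedora.linux_system_roles` or the `package_facts` module itself.

```python
# Sketch: querying a package_facts-style mapping.
# The dict layout mirrors the module output logged above; the helper
# functions are hypothetical, shown only to clarify the structure.

packages = {
    # Excerpt of the structure seen in the log above.
    "git": [{"name": "git", "version": "2.43.5", "release": "1.el9",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "perl-libs": [{"name": "perl-libs", "version": "5.32.1",
                   "release": "481.el9", "epoch": 4, "arch": "x86_64",
                   "source": "rpm"}],
}

def installed_version(packages, name):
    """Return the version string of the first entry for *name*, or None."""
    entries = packages.get(name, [])
    return entries[0]["version"] if entries else None

def full_evr(pkg):
    """Render epoch:version-release the way rpm displays it (epoch omitted when null/0)."""
    evr = f"{pkg['version']}-{pkg['release']}"
    epoch = pkg.get("epoch")
    return f"{epoch}:{evr}" if epoch else evr
```

For example, `installed_version(packages, "git")` yields `"2.43.5"` and `full_evr(packages["perl-libs"][0])` yields `"4:5.32.1-481.el9"`, matching the epoch-qualified versions recorded in the log.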
27844 1726882756.78868: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882756.78886: _low_level_execute_command(): starting 27844 1726882756.78889: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882755.8825438-28654-80852991825306/ > /dev/null 2>&1 && sleep 0' 27844 1726882756.79353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882756.79357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882756.79398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.79402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882756.79411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882756.79421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882756.79467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882756.79483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882756.79494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882756.79603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882756.81434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882756.81482: stderr chunk (state=3): >>><<< 27844 1726882756.81485: stdout chunk (state=3): >>><<< 27844 1726882756.81497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 27844 1726882756.81502: handler run complete 27844 1726882756.81993: variable 'ansible_facts' from source: unknown 27844 1726882756.82256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882756.83476: variable 'ansible_facts' from source: unknown 27844 1726882756.83728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882756.84167: attempt loop complete, returning result 27844 1726882756.84176: _execute() done 27844 1726882756.84179: dumping result to json 27844 1726882756.84305: done dumping result, returning 27844 1726882756.84310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-efa9-466a-0000000004a1] 27844 1726882756.84315: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000004a1 27844 1726882756.85709: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000004a1 27844 1726882756.85712: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882756.85750: no more pending results, returning what we have 27844 1726882756.85752: results queue empty 27844 1726882756.85753: checking for any_errors_fatal 27844 1726882756.85756: done checking for any_errors_fatal 27844 1726882756.85756: checking for max_fail_percentage 27844 1726882756.85758: done checking for max_fail_percentage 27844 1726882756.85758: checking to see if all hosts have failed and the running result is not ok 27844 1726882756.85759: done checking to see if all hosts have failed 27844 1726882756.85759: getting the remaining hosts for this loop 27844 1726882756.85760: done getting the remaining hosts for this loop 27844 1726882756.85763: getting the next task for host 
managed_node1 27844 1726882756.85769: done getting next task for host managed_node1 27844 1726882756.85771: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27844 1726882756.85773: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882756.85780: getting variables 27844 1726882756.85781: in VariableManager get_vars() 27844 1726882756.85809: Calling all_inventory to load vars for managed_node1 27844 1726882756.85810: Calling groups_inventory to load vars for managed_node1 27844 1726882756.85812: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882756.85819: Calling all_plugins_play to load vars for managed_node1 27844 1726882756.85820: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882756.85822: Calling groups_plugins_play to load vars for managed_node1 27844 1726882756.86565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882756.87490: done with get_vars() 27844 1726882756.87505: done getting variables 27844 1726882756.87550: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] 
************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:16 -0400 (0:00:01.036) 0:00:15.952 ****** 27844 1726882756.87579: entering _queue_task() for managed_node1/debug 27844 1726882756.87779: worker is 1 (out of 1 available) 27844 1726882756.87793: exiting _queue_task() for managed_node1/debug 27844 1726882756.87806: done queuing things up, now waiting for results queue to drain 27844 1726882756.87807: waiting for pending results... 27844 1726882756.87985: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 27844 1726882756.88068: in run() - task 0e448fcc-3ce9-efa9-466a-00000000001c 27844 1726882756.88084: variable 'ansible_search_path' from source: unknown 27844 1726882756.88088: variable 'ansible_search_path' from source: unknown 27844 1726882756.88116: calling self._execute() 27844 1726882756.88190: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882756.88194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882756.88200: variable 'omit' from source: magic vars 27844 1726882756.88468: variable 'ansible_distribution_major_version' from source: facts 27844 1726882756.88476: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882756.88482: variable 'omit' from source: magic vars 27844 1726882756.88518: variable 'omit' from source: magic vars 27844 1726882756.88583: variable 'network_provider' from source: set_fact 27844 1726882756.88600: variable 'omit' from source: magic vars 27844 1726882756.88632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882756.88658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882756.88677: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882756.88691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882756.88701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882756.88721: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882756.88728: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882756.88735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882756.88802: Set connection var ansible_shell_type to sh 27844 1726882756.88806: Set connection var ansible_connection to ssh 27844 1726882756.88811: Set connection var ansible_pipelining to False 27844 1726882756.88816: Set connection var ansible_timeout to 10 27844 1726882756.88821: Set connection var ansible_shell_executable to /bin/sh 27844 1726882756.88827: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882756.88848: variable 'ansible_shell_executable' from source: unknown 27844 1726882756.88851: variable 'ansible_connection' from source: unknown 27844 1726882756.88854: variable 'ansible_module_compression' from source: unknown 27844 1726882756.88856: variable 'ansible_shell_type' from source: unknown 27844 1726882756.88859: variable 'ansible_shell_executable' from source: unknown 27844 1726882756.88862: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882756.88867: variable 'ansible_pipelining' from source: unknown 27844 1726882756.88870: variable 'ansible_timeout' from source: unknown 27844 1726882756.88872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882756.88968: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882756.88975: variable 'omit' from source: magic vars 27844 1726882756.88979: starting attempt loop 27844 1726882756.88982: running the handler 27844 1726882756.89016: handler run complete 27844 1726882756.89030: attempt loop complete, returning result 27844 1726882756.89033: _execute() done 27844 1726882756.89035: dumping result to json 27844 1726882756.89038: done dumping result, returning 27844 1726882756.89044: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-efa9-466a-00000000001c] 27844 1726882756.89050: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001c 27844 1726882756.89129: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001c 27844 1726882756.89132: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 27844 1726882756.89191: no more pending results, returning what we have 27844 1726882756.89195: results queue empty 27844 1726882756.89196: checking for any_errors_fatal 27844 1726882756.89206: done checking for any_errors_fatal 27844 1726882756.89206: checking for max_fail_percentage 27844 1726882756.89208: done checking for max_fail_percentage 27844 1726882756.89209: checking to see if all hosts have failed and the running result is not ok 27844 1726882756.89209: done checking to see if all hosts have failed 27844 1726882756.89210: getting the remaining hosts for this loop 27844 1726882756.89212: done getting the remaining hosts for this loop 27844 1726882756.89215: getting the next task for host managed_node1 27844 1726882756.89220: done getting next task for host managed_node1 27844 1726882756.89223: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 27844 1726882756.89226: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882756.89235: getting variables 27844 1726882756.89236: in VariableManager get_vars() 27844 1726882756.89274: Calling all_inventory to load vars for managed_node1 27844 1726882756.89277: Calling groups_inventory to load vars for managed_node1 27844 1726882756.89279: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882756.89287: Calling all_plugins_play to load vars for managed_node1 27844 1726882756.89290: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882756.89292: Calling groups_plugins_play to load vars for managed_node1 27844 1726882756.90032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882756.91034: done with get_vars() 27844 1726882756.91049: done getting variables 27844 1726882756.91090: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:16 -0400 (0:00:00.035) 0:00:15.987 ****** 27844 1726882756.91113: entering _queue_task() for managed_node1/fail 27844 1726882756.91288: worker is 1 (out of 1 available) 27844 1726882756.91302: exiting _queue_task() for managed_node1/fail 27844 1726882756.91315: done queuing things up, now waiting for results queue to drain 27844 1726882756.91317: waiting for pending results... 27844 1726882756.91489: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27844 1726882756.91573: in run() - task 0e448fcc-3ce9-efa9-466a-00000000001d 27844 1726882756.91584: variable 'ansible_search_path' from source: unknown 27844 1726882756.91588: variable 'ansible_search_path' from source: unknown 27844 1726882756.91615: calling self._execute() 27844 1726882756.91680: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882756.91683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882756.91691: variable 'omit' from source: magic vars 27844 1726882756.91942: variable 'ansible_distribution_major_version' from source: facts 27844 1726882756.91952: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882756.92037: variable 'network_state' from source: role '' defaults 27844 1726882756.92045: Evaluated conditional (network_state != {}): False 27844 1726882756.92048: when evaluation is False, skipping this task 27844 1726882756.92051: _execute() done 27844 1726882756.92054: dumping result to json 27844 1726882756.92056: done dumping result, returning 27844 1726882756.92062: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0e448fcc-3ce9-efa9-466a-00000000001d] 27844 1726882756.92071: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001d 27844 1726882756.92150: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001d 27844 1726882756.92154: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882756.92204: no more pending results, returning what we have 27844 1726882756.92208: results queue empty 27844 1726882756.92209: checking for any_errors_fatal 27844 1726882756.92213: done checking for any_errors_fatal 27844 1726882756.92214: checking for max_fail_percentage 27844 1726882756.92215: done checking for max_fail_percentage 27844 1726882756.92216: checking to see if all hosts have failed and the running result is not ok 27844 1726882756.92217: done checking to see if all hosts have failed 27844 1726882756.92218: getting the remaining hosts for this loop 27844 1726882756.92219: done getting the remaining hosts for this loop 27844 1726882756.92221: getting the next task for host managed_node1 27844 1726882756.92226: done getting next task for host managed_node1 27844 1726882756.92229: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27844 1726882756.92232: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 27844 1726882756.92244: getting variables 27844 1726882756.92245: in VariableManager get_vars() 27844 1726882756.92283: Calling all_inventory to load vars for managed_node1 27844 1726882756.92285: Calling groups_inventory to load vars for managed_node1 27844 1726882756.92286: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882756.92293: Calling all_plugins_play to load vars for managed_node1 27844 1726882756.92294: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882756.92296: Calling groups_plugins_play to load vars for managed_node1 27844 1726882756.93037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882756.93962: done with get_vars() 27844 1726882756.93978: done getting variables 27844 1726882756.94018: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:16 -0400 (0:00:00.029) 0:00:16.017 ****** 27844 1726882756.94039: entering _queue_task() for managed_node1/fail 27844 1726882756.94205: worker is 1 (out of 1 available) 27844 1726882756.94220: exiting _queue_task() for managed_node1/fail 27844 1726882756.94231: done queuing things up, now waiting for results queue to drain 27844 1726882756.94233: waiting for pending results... 
27844 1726882756.94391: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27844 1726882756.94466: in run() - task 0e448fcc-3ce9-efa9-466a-00000000001e 27844 1726882756.94478: variable 'ansible_search_path' from source: unknown 27844 1726882756.94482: variable 'ansible_search_path' from source: unknown 27844 1726882756.94508: calling self._execute() 27844 1726882756.94574: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882756.94578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882756.94585: variable 'omit' from source: magic vars 27844 1726882756.94828: variable 'ansible_distribution_major_version' from source: facts 27844 1726882756.94837: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882756.94918: variable 'network_state' from source: role '' defaults 27844 1726882756.94926: Evaluated conditional (network_state != {}): False 27844 1726882756.94929: when evaluation is False, skipping this task 27844 1726882756.94932: _execute() done 27844 1726882756.94934: dumping result to json 27844 1726882756.94937: done dumping result, returning 27844 1726882756.94944: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-efa9-466a-00000000001e] 27844 1726882756.94947: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001e 27844 1726882756.95033: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001e 27844 1726882756.95036: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882756.95084: no more pending results, returning what we have 27844 
1726882756.95088: results queue empty 27844 1726882756.95089: checking for any_errors_fatal 27844 1726882756.95094: done checking for any_errors_fatal 27844 1726882756.95094: checking for max_fail_percentage 27844 1726882756.95096: done checking for max_fail_percentage 27844 1726882756.95096: checking to see if all hosts have failed and the running result is not ok 27844 1726882756.95097: done checking to see if all hosts have failed 27844 1726882756.95098: getting the remaining hosts for this loop 27844 1726882756.95099: done getting the remaining hosts for this loop 27844 1726882756.95102: getting the next task for host managed_node1 27844 1726882756.95106: done getting next task for host managed_node1 27844 1726882756.95110: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27844 1726882756.95112: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882756.95123: getting variables 27844 1726882756.95124: in VariableManager get_vars() 27844 1726882756.95155: Calling all_inventory to load vars for managed_node1 27844 1726882756.95156: Calling groups_inventory to load vars for managed_node1 27844 1726882756.95158: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882756.95166: Calling all_plugins_play to load vars for managed_node1 27844 1726882756.95168: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882756.95170: Calling groups_plugins_play to load vars for managed_node1 27844 1726882756.95996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882756.96899: done with get_vars() 27844 1726882756.96913: done getting variables 27844 1726882756.96950: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:16 -0400 (0:00:00.029) 0:00:16.046 ****** 27844 1726882756.96973: entering _queue_task() for managed_node1/fail 27844 1726882756.97205: worker is 1 (out of 1 available) 27844 1726882756.97216: exiting _queue_task() for managed_node1/fail 27844 1726882756.97227: done queuing things up, now waiting for results queue to drain 27844 1726882756.97228: waiting for pending results... 
27844 1726882756.97513: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27844 1726882756.97640: in run() - task 0e448fcc-3ce9-efa9-466a-00000000001f 27844 1726882756.97657: variable 'ansible_search_path' from source: unknown 27844 1726882756.97669: variable 'ansible_search_path' from source: unknown 27844 1726882756.97714: calling self._execute() 27844 1726882756.97806: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882756.97816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882756.97829: variable 'omit' from source: magic vars 27844 1726882756.98198: variable 'ansible_distribution_major_version' from source: facts 27844 1726882756.98217: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882756.98385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882756.99978: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882757.00026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882757.00054: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882757.00083: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882757.00103: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882757.00158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.00184: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.00202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.00228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.00238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.00337: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.00354: Evaluated conditional (ansible_distribution_major_version | int > 9): False 27844 1726882757.00361: when evaluation is False, skipping this task 27844 1726882757.00373: _execute() done 27844 1726882757.00381: dumping result to json 27844 1726882757.00390: done dumping result, returning 27844 1726882757.00401: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-efa9-466a-00000000001f] 27844 1726882757.00411: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001f 27844 1726882757.00509: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000001f 27844 1726882757.00517: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 27844 1726882757.00610: no more pending results, returning what we have 27844 1726882757.00614: 
results queue empty 27844 1726882757.00615: checking for any_errors_fatal 27844 1726882757.00624: done checking for any_errors_fatal 27844 1726882757.00625: checking for max_fail_percentage 27844 1726882757.00627: done checking for max_fail_percentage 27844 1726882757.00628: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.00628: done checking to see if all hosts have failed 27844 1726882757.00629: getting the remaining hosts for this loop 27844 1726882757.00630: done getting the remaining hosts for this loop 27844 1726882757.00634: getting the next task for host managed_node1 27844 1726882757.00638: done getting next task for host managed_node1 27844 1726882757.00642: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27844 1726882757.00645: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882757.00658: getting variables 27844 1726882757.00660: in VariableManager get_vars() 27844 1726882757.01106: Calling all_inventory to load vars for managed_node1 27844 1726882757.01109: Calling groups_inventory to load vars for managed_node1 27844 1726882757.01111: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.01119: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.01122: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.01125: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.02491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.04243: done with get_vars() 27844 1726882757.04268: done getting variables 27844 1726882757.04361: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:17 -0400 (0:00:00.074) 0:00:16.120 ****** 27844 1726882757.04396: entering _queue_task() for managed_node1/dnf 27844 1726882757.04641: worker is 1 (out of 1 available) 27844 1726882757.04652: exiting _queue_task() for managed_node1/dnf 27844 1726882757.04662: done queuing things up, now waiting for results queue to drain 27844 1726882757.04668: waiting for pending results... 
27844 1726882757.04950: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27844 1726882757.05095: in run() - task 0e448fcc-3ce9-efa9-466a-000000000020 27844 1726882757.05120: variable 'ansible_search_path' from source: unknown 27844 1726882757.05128: variable 'ansible_search_path' from source: unknown 27844 1726882757.05176: calling self._execute() 27844 1726882757.05273: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.05285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.05300: variable 'omit' from source: magic vars 27844 1726882757.05684: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.05703: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.05915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882757.12230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882757.12312: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882757.12352: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882757.12398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882757.12428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882757.12504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.12536: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.12571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.12619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.12638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.12753: variable 'ansible_distribution' from source: facts 27844 1726882757.12756: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.12768: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27844 1726882757.12867: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882757.12951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.12969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.13000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.13024: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.13033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.13062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.13083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.13099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.13123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.13135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.13164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.13187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 
1726882757.13203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.13227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.13237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.13340: variable 'network_connections' from source: task vars 27844 1726882757.13348: variable 'interface0' from source: play vars 27844 1726882757.13397: variable 'interface0' from source: play vars 27844 1726882757.13402: variable 'interface0' from source: play vars 27844 1726882757.13444: variable 'interface0' from source: play vars 27844 1726882757.13453: variable 'interface1' from source: play vars 27844 1726882757.13500: variable 'interface1' from source: play vars 27844 1726882757.13505: variable 'interface1' from source: play vars 27844 1726882757.13547: variable 'interface1' from source: play vars 27844 1726882757.13598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882757.13712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882757.13738: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882757.13759: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882757.13781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882757.13815: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882757.13830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882757.13847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.13868: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882757.13902: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882757.14043: variable 'network_connections' from source: task vars 27844 1726882757.14047: variable 'interface0' from source: play vars 27844 1726882757.14091: variable 'interface0' from source: play vars 27844 1726882757.14097: variable 'interface0' from source: play vars 27844 1726882757.14139: variable 'interface0' from source: play vars 27844 1726882757.14148: variable 'interface1' from source: play vars 27844 1726882757.14191: variable 'interface1' from source: play vars 27844 1726882757.14196: variable 'interface1' from source: play vars 27844 1726882757.14236: variable 'interface1' from source: play vars 27844 1726882757.14268: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882757.14272: when evaluation is False, skipping this task 27844 1726882757.14275: _execute() done 27844 1726882757.14277: dumping result to json 27844 1726882757.14281: done dumping result, returning 27844 1726882757.14287: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000020] 27844 1726882757.14291: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882757.14417: no more pending results, returning what we have 27844 1726882757.14420: results queue empty 27844 1726882757.14421: checking for any_errors_fatal 27844 1726882757.14449: done checking for any_errors_fatal 27844 1726882757.14451: checking for max_fail_percentage 27844 1726882757.14453: done checking for max_fail_percentage 27844 1726882757.14454: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.14455: done checking to see if all hosts have failed 27844 1726882757.14455: getting the remaining hosts for this loop 27844 1726882757.14457: done getting the remaining hosts for this loop 27844 1726882757.14461: getting the next task for host managed_node1 27844 1726882757.14468: done getting next task for host managed_node1 27844 1726882757.14473: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27844 1726882757.14475: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 27844 1726882757.14488: getting variables 27844 1726882757.14489: in VariableManager get_vars() 27844 1726882757.14722: Calling all_inventory to load vars for managed_node1 27844 1726882757.14725: Calling groups_inventory to load vars for managed_node1 27844 1726882757.14728: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.14739: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.14742: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.14745: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.15409: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000020 27844 1726882757.15413: WORKER PROCESS EXITING 27844 1726882757.18568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.19995: done with get_vars() 27844 1726882757.20010: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27844 1726882757.20054: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:17 -0400 (0:00:00.156) 0:00:16.277 ****** 27844 1726882757.20078: entering _queue_task() for managed_node1/yum 27844 1726882757.20079: Creating lock for yum 27844 1726882757.20300: worker is 1 (out of 1 available) 
27844 1726882757.20315: exiting _queue_task() for managed_node1/yum 27844 1726882757.20326: done queuing things up, now waiting for results queue to drain 27844 1726882757.20329: waiting for pending results... 27844 1726882757.20502: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27844 1726882757.20591: in run() - task 0e448fcc-3ce9-efa9-466a-000000000021 27844 1726882757.20603: variable 'ansible_search_path' from source: unknown 27844 1726882757.20608: variable 'ansible_search_path' from source: unknown 27844 1726882757.20636: calling self._execute() 27844 1726882757.20712: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.20717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.20725: variable 'omit' from source: magic vars 27844 1726882757.20999: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.21015: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.21135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882757.23391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882757.23469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882757.23509: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882757.23548: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882757.23582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882757.23659: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.23697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.23729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.23782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.23804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.23902: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.23922: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27844 1726882757.23932: when evaluation is False, skipping this task 27844 1726882757.23940: _execute() done 27844 1726882757.23946: dumping result to json 27844 1726882757.23953: done dumping result, returning 27844 1726882757.23963: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000021] 27844 1726882757.23976: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was 
False" } 27844 1726882757.24123: no more pending results, returning what we have 27844 1726882757.24127: results queue empty 27844 1726882757.24128: checking for any_errors_fatal 27844 1726882757.24136: done checking for any_errors_fatal 27844 1726882757.24137: checking for max_fail_percentage 27844 1726882757.24139: done checking for max_fail_percentage 27844 1726882757.24140: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.24141: done checking to see if all hosts have failed 27844 1726882757.24141: getting the remaining hosts for this loop 27844 1726882757.24143: done getting the remaining hosts for this loop 27844 1726882757.24146: getting the next task for host managed_node1 27844 1726882757.24153: done getting next task for host managed_node1 27844 1726882757.24157: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27844 1726882757.24160: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882757.24178: getting variables 27844 1726882757.24181: in VariableManager get_vars() 27844 1726882757.24223: Calling all_inventory to load vars for managed_node1 27844 1726882757.24226: Calling groups_inventory to load vars for managed_node1 27844 1726882757.24229: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.24240: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.24243: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.24246: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.25115: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000021 27844 1726882757.25119: WORKER PROCESS EXITING 27844 1726882757.25445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.26387: done with get_vars() 27844 1726882757.26412: done getting variables 27844 1726882757.26468: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:17 -0400 (0:00:00.064) 0:00:16.341 ****** 27844 1726882757.26498: entering _queue_task() for managed_node1/fail 27844 1726882757.26760: worker is 1 (out of 1 available) 27844 1726882757.26779: exiting _queue_task() for managed_node1/fail 27844 1726882757.26794: done queuing things up, now waiting for results queue to drain 27844 1726882757.26796: waiting for pending results... 
27844 1726882757.27072: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27844 1726882757.27200: in run() - task 0e448fcc-3ce9-efa9-466a-000000000022 27844 1726882757.27220: variable 'ansible_search_path' from source: unknown 27844 1726882757.27228: variable 'ansible_search_path' from source: unknown 27844 1726882757.27278: calling self._execute() 27844 1726882757.27382: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.27394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.27410: variable 'omit' from source: magic vars 27844 1726882757.27761: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.27775: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.27855: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882757.27995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882757.29538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882757.29595: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882757.29620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882757.29646: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882757.29668: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882757.29725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 27844 1726882757.29746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.29768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.29797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.29807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.29839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.29855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.29878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.29906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.29916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.29943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.29960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.29980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.30008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.30018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.30132: variable 'network_connections' from source: task vars 27844 1726882757.30141: variable 'interface0' from source: play vars 27844 1726882757.30194: variable 'interface0' from source: play vars 27844 1726882757.30201: variable 'interface0' from source: play vars 27844 1726882757.30245: variable 'interface0' from source: play vars 27844 1726882757.30255: variable 'interface1' from source: play vars 27844 1726882757.30301: variable 'interface1' from source: play vars 27844 1726882757.30307: variable 'interface1' from source: play vars 27844 1726882757.30350: variable 'interface1' from source: play vars 27844 1726882757.30403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
27844 1726882757.30522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882757.30549: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882757.30575: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882757.30596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882757.30627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882757.30643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882757.30668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.30688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882757.30732: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882757.30888: variable 'network_connections' from source: task vars 27844 1726882757.30892: variable 'interface0' from source: play vars 27844 1726882757.30934: variable 'interface0' from source: play vars 27844 1726882757.30939: variable 'interface0' from source: play vars 27844 1726882757.30987: variable 'interface0' from source: play vars 27844 1726882757.30996: variable 'interface1' from source: play vars 27844 1726882757.31039: variable 'interface1' from source: play vars 27844 1726882757.31050: variable 
'interface1' from source: play vars 27844 1726882757.31091: variable 'interface1' from source: play vars 27844 1726882757.31118: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882757.31121: when evaluation is False, skipping this task 27844 1726882757.31123: _execute() done 27844 1726882757.31126: dumping result to json 27844 1726882757.31128: done dumping result, returning 27844 1726882757.31134: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000022] 27844 1726882757.31139: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000022 27844 1726882757.31225: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000022 27844 1726882757.31228: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882757.31303: no more pending results, returning what we have 27844 1726882757.31307: results queue empty 27844 1726882757.31308: checking for any_errors_fatal 27844 1726882757.31313: done checking for any_errors_fatal 27844 1726882757.31314: checking for max_fail_percentage 27844 1726882757.31315: done checking for max_fail_percentage 27844 1726882757.31316: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.31317: done checking to see if all hosts have failed 27844 1726882757.31318: getting the remaining hosts for this loop 27844 1726882757.31319: done getting the remaining hosts for this loop 27844 1726882757.31322: getting the next task for host managed_node1 27844 1726882757.31327: done getting next task for host managed_node1 27844 1726882757.31330: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27844 
1726882757.31333: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882757.31346: getting variables 27844 1726882757.31347: in VariableManager get_vars() 27844 1726882757.31385: Calling all_inventory to load vars for managed_node1 27844 1726882757.31388: Calling groups_inventory to load vars for managed_node1 27844 1726882757.31390: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.31398: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.31400: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.31402: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.32184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.33128: done with get_vars() 27844 1726882757.33144: done getting variables 27844 1726882757.33188: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:17 -0400 (0:00:00.067) 0:00:16.408 
****** 27844 1726882757.33210: entering _queue_task() for managed_node1/package 27844 1726882757.33395: worker is 1 (out of 1 available) 27844 1726882757.33408: exiting _queue_task() for managed_node1/package 27844 1726882757.33420: done queuing things up, now waiting for results queue to drain 27844 1726882757.33422: waiting for pending results... 27844 1726882757.33581: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 27844 1726882757.33661: in run() - task 0e448fcc-3ce9-efa9-466a-000000000023 27844 1726882757.33675: variable 'ansible_search_path' from source: unknown 27844 1726882757.33679: variable 'ansible_search_path' from source: unknown 27844 1726882757.33710: calling self._execute() 27844 1726882757.33780: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.33784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.33792: variable 'omit' from source: magic vars 27844 1726882757.34047: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.34056: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.34186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882757.34367: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882757.34396: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882757.34422: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882757.34461: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882757.34547: variable 'network_packages' from source: role '' defaults 27844 1726882757.34622: variable '__network_provider_setup' from source: role '' defaults 
27844 1726882757.34632: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882757.34687: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882757.34694: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882757.34736: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882757.34850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882757.36416: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882757.36462: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882757.36493: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882757.36516: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882757.36540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882757.36598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.36617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.36634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.36669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.36680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.36712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.36727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.36746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.36777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.36787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.36930: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27844 1726882757.37002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.37019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27844 1726882757.37037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.37068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.37079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.37136: variable 'ansible_python' from source: facts 27844 1726882757.37155: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27844 1726882757.37214: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882757.37270: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882757.37348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.37369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.37385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.37413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 
1726882757.37423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.37454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.37477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.37493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.37522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.37532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.37627: variable 'network_connections' from source: task vars 27844 1726882757.37632: variable 'interface0' from source: play vars 27844 1726882757.37701: variable 'interface0' from source: play vars 27844 1726882757.37709: variable 'interface0' from source: play vars 27844 1726882757.37777: variable 'interface0' from source: play vars 27844 1726882757.37788: variable 'interface1' from source: play vars 27844 1726882757.37856: variable 'interface1' from source: play vars 27844 1726882757.37864: variable 'interface1' from source: play vars 27844 1726882757.37931: variable 'interface1' from source: play vars 27844 1726882757.37986: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882757.38005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882757.38027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.38049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882757.38088: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882757.38267: variable 'network_connections' from source: task vars 27844 1726882757.38276: variable 'interface0' from source: play vars 27844 1726882757.38337: variable 'interface0' from source: play vars 27844 1726882757.38342: variable 'interface0' from source: play vars 27844 1726882757.38417: variable 'interface0' from source: play vars 27844 1726882757.38427: variable 'interface1' from source: play vars 27844 1726882757.38500: variable 'interface1' from source: play vars 27844 1726882757.38507: variable 'interface1' from source: play vars 27844 1726882757.38576: variable 'interface1' from source: play vars 27844 1726882757.38617: variable '__network_packages_default_wireless' from source: role '' defaults 27844 1726882757.38673: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882757.38871: variable 'network_connections' from source: task vars 27844 1726882757.38874: variable 'interface0' from source: play vars 27844 1726882757.38923: variable 'interface0' from source: play vars 27844 
1726882757.38927: variable 'interface0' from source: play vars 27844 1726882757.38998: variable 'interface0' from source: play vars 27844 1726882757.39001: variable 'interface1' from source: play vars 27844 1726882757.39022: variable 'interface1' from source: play vars 27844 1726882757.39032: variable 'interface1' from source: play vars 27844 1726882757.39076: variable 'interface1' from source: play vars 27844 1726882757.39096: variable '__network_packages_default_team' from source: role '' defaults 27844 1726882757.39151: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882757.39344: variable 'network_connections' from source: task vars 27844 1726882757.39348: variable 'interface0' from source: play vars 27844 1726882757.39393: variable 'interface0' from source: play vars 27844 1726882757.39398: variable 'interface0' from source: play vars 27844 1726882757.39442: variable 'interface0' from source: play vars 27844 1726882757.39455: variable 'interface1' from source: play vars 27844 1726882757.39501: variable 'interface1' from source: play vars 27844 1726882757.39506: variable 'interface1' from source: play vars 27844 1726882757.39550: variable 'interface1' from source: play vars 27844 1726882757.39599: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882757.39639: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882757.39647: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882757.39691: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882757.39826: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27844 1726882757.40127: variable 'network_connections' from source: task vars 27844 1726882757.40130: variable 'interface0' from source: play vars 27844 1726882757.40174: variable 'interface0' from source: play vars 27844 
1726882757.40180: variable 'interface0' from source: play vars 27844 1726882757.40224: variable 'interface0' from source: play vars 27844 1726882757.40232: variable 'interface1' from source: play vars 27844 1726882757.40276: variable 'interface1' from source: play vars 27844 1726882757.40281: variable 'interface1' from source: play vars 27844 1726882757.40324: variable 'interface1' from source: play vars 27844 1726882757.40335: variable 'ansible_distribution' from source: facts 27844 1726882757.40338: variable '__network_rh_distros' from source: role '' defaults 27844 1726882757.40343: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.40360: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27844 1726882757.40470: variable 'ansible_distribution' from source: facts 27844 1726882757.40473: variable '__network_rh_distros' from source: role '' defaults 27844 1726882757.40476: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.40486: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27844 1726882757.40594: variable 'ansible_distribution' from source: facts 27844 1726882757.40597: variable '__network_rh_distros' from source: role '' defaults 27844 1726882757.40600: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.40626: variable 'network_provider' from source: set_fact 27844 1726882757.40638: variable 'ansible_facts' from source: unknown 27844 1726882757.41155: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27844 1726882757.41166: when evaluation is False, skipping this task 27844 1726882757.41173: _execute() done 27844 1726882757.41180: dumping result to json 27844 1726882757.41188: done dumping result, returning 27844 1726882757.41199: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 
[0e448fcc-3ce9-efa9-466a-000000000023] 27844 1726882757.41208: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000023 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27844 1726882757.41366: no more pending results, returning what we have 27844 1726882757.41371: results queue empty 27844 1726882757.41373: checking for any_errors_fatal 27844 1726882757.41382: done checking for any_errors_fatal 27844 1726882757.41383: checking for max_fail_percentage 27844 1726882757.41389: done checking for max_fail_percentage 27844 1726882757.41390: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.41391: done checking to see if all hosts have failed 27844 1726882757.41392: getting the remaining hosts for this loop 27844 1726882757.41393: done getting the remaining hosts for this loop 27844 1726882757.41397: getting the next task for host managed_node1 27844 1726882757.41404: done getting next task for host managed_node1 27844 1726882757.41409: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27844 1726882757.41412: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882757.41428: getting variables 27844 1726882757.41429: in VariableManager get_vars() 27844 1726882757.41478: Calling all_inventory to load vars for managed_node1 27844 1726882757.41481: Calling groups_inventory to load vars for managed_node1 27844 1726882757.41484: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.41495: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.41498: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.41501: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.42670: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000023 27844 1726882757.42675: WORKER PROCESS EXITING 27844 1726882757.43333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.44508: done with get_vars() 27844 1726882757.44523: done getting variables 27844 1726882757.44568: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:17 -0400 (0:00:00.113) 0:00:16.522 ****** 27844 1726882757.44592: entering _queue_task() for managed_node1/package 27844 1726882757.44786: worker is 1 (out of 1 available) 27844 1726882757.44798: exiting _queue_task() for managed_node1/package 27844 1726882757.44810: done queuing things up, now waiting for results queue to drain 27844 1726882757.44811: waiting for pending results... 
27844 1726882757.44982: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27844 1726882757.45077: in run() - task 0e448fcc-3ce9-efa9-466a-000000000024 27844 1726882757.45088: variable 'ansible_search_path' from source: unknown 27844 1726882757.45092: variable 'ansible_search_path' from source: unknown 27844 1726882757.45126: calling self._execute() 27844 1726882757.45205: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.45214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.45222: variable 'omit' from source: magic vars 27844 1726882757.45500: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.45510: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.45596: variable 'network_state' from source: role '' defaults 27844 1726882757.45606: Evaluated conditional (network_state != {}): False 27844 1726882757.45610: when evaluation is False, skipping this task 27844 1726882757.45612: _execute() done 27844 1726882757.45615: dumping result to json 27844 1726882757.45617: done dumping result, returning 27844 1726882757.45623: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-efa9-466a-000000000024] 27844 1726882757.45628: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000024 27844 1726882757.45721: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000024 27844 1726882757.45724: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882757.45782: no more pending results, returning what we have 27844 1726882757.45786: results queue empty 27844 1726882757.45787: checking 
for any_errors_fatal 27844 1726882757.45790: done checking for any_errors_fatal 27844 1726882757.45791: checking for max_fail_percentage 27844 1726882757.45792: done checking for max_fail_percentage 27844 1726882757.45793: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.45794: done checking to see if all hosts have failed 27844 1726882757.45795: getting the remaining hosts for this loop 27844 1726882757.45796: done getting the remaining hosts for this loop 27844 1726882757.45799: getting the next task for host managed_node1 27844 1726882757.45804: done getting next task for host managed_node1 27844 1726882757.45807: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27844 1726882757.45810: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882757.45823: getting variables 27844 1726882757.45824: in VariableManager get_vars() 27844 1726882757.45860: Calling all_inventory to load vars for managed_node1 27844 1726882757.45863: Calling groups_inventory to load vars for managed_node1 27844 1726882757.45867: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.45876: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.45877: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.45879: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.46632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.47657: done with get_vars() 27844 1726882757.47675: done getting variables 27844 1726882757.47717: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:17 -0400 (0:00:00.031) 0:00:16.553 ****** 27844 1726882757.47738: entering _queue_task() for managed_node1/package 27844 1726882757.47910: worker is 1 (out of 1 available) 27844 1726882757.47922: exiting _queue_task() for managed_node1/package 27844 1726882757.47934: done queuing things up, now waiting for results queue to drain 27844 1726882757.47935: waiting for pending results... 
27844 1726882757.48115: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27844 1726882757.48195: in run() - task 0e448fcc-3ce9-efa9-466a-000000000025 27844 1726882757.48206: variable 'ansible_search_path' from source: unknown 27844 1726882757.48210: variable 'ansible_search_path' from source: unknown 27844 1726882757.48244: calling self._execute() 27844 1726882757.48318: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.48322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.48330: variable 'omit' from source: magic vars 27844 1726882757.48592: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.48603: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.48685: variable 'network_state' from source: role '' defaults 27844 1726882757.48694: Evaluated conditional (network_state != {}): False 27844 1726882757.48697: when evaluation is False, skipping this task 27844 1726882757.48700: _execute() done 27844 1726882757.48703: dumping result to json 27844 1726882757.48705: done dumping result, returning 27844 1726882757.48711: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-efa9-466a-000000000025] 27844 1726882757.48719: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000025 27844 1726882757.48806: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000025 27844 1726882757.48809: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882757.48859: no more pending results, returning what we have 27844 1726882757.48862: results queue empty 27844 1726882757.48867: checking for 
any_errors_fatal 27844 1726882757.48874: done checking for any_errors_fatal 27844 1726882757.48874: checking for max_fail_percentage 27844 1726882757.48876: done checking for max_fail_percentage 27844 1726882757.48877: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.48877: done checking to see if all hosts have failed 27844 1726882757.48878: getting the remaining hosts for this loop 27844 1726882757.48879: done getting the remaining hosts for this loop 27844 1726882757.48881: getting the next task for host managed_node1 27844 1726882757.48886: done getting next task for host managed_node1 27844 1726882757.48890: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27844 1726882757.48892: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882757.48904: getting variables 27844 1726882757.48906: in VariableManager get_vars() 27844 1726882757.48939: Calling all_inventory to load vars for managed_node1 27844 1726882757.48941: Calling groups_inventory to load vars for managed_node1 27844 1726882757.48944: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.48952: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.48954: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.48956: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.49704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.50633: done with get_vars() 27844 1726882757.50648: done getting variables 27844 1726882757.50719: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:17 -0400 (0:00:00.030) 0:00:16.584 ****** 27844 1726882757.50741: entering _queue_task() for managed_node1/service 27844 1726882757.50743: Creating lock for service 27844 1726882757.50920: worker is 1 (out of 1 available) 27844 1726882757.50933: exiting _queue_task() for managed_node1/service 27844 1726882757.50943: done queuing things up, now waiting for results queue to drain 27844 1726882757.50945: waiting for pending results... 
27844 1726882757.51101: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27844 1726882757.51179: in run() - task 0e448fcc-3ce9-efa9-466a-000000000026 27844 1726882757.51191: variable 'ansible_search_path' from source: unknown 27844 1726882757.51194: variable 'ansible_search_path' from source: unknown 27844 1726882757.51222: calling self._execute() 27844 1726882757.51292: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.51295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.51303: variable 'omit' from source: magic vars 27844 1726882757.51574: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.51584: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.51663: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882757.51794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882757.53332: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882757.53593: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882757.53619: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882757.53644: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882757.53662: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882757.53723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 27844 1726882757.53743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.53760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.53794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.53805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.53835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.53851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.53871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.53903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.53914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.53940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.53956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.53976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.54004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.54015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.54120: variable 'network_connections' from source: task vars 27844 1726882757.54132: variable 'interface0' from source: play vars 27844 1726882757.54182: variable 'interface0' from source: play vars 27844 1726882757.54189: variable 'interface0' from source: play vars 27844 1726882757.54237: variable 'interface0' from source: play vars 27844 1726882757.54247: variable 'interface1' from source: play vars 27844 1726882757.54292: variable 'interface1' from source: play vars 27844 1726882757.54297: variable 'interface1' from source: play vars 27844 1726882757.54341: variable 'interface1' from source: play vars 27844 1726882757.54390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
27844 1726882757.54496: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882757.54522: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882757.54544: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882757.54579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882757.54609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882757.54624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882757.54642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.54659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882757.54720: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882757.54864: variable 'network_connections' from source: task vars 27844 1726882757.54877: variable 'interface0' from source: play vars 27844 1726882757.54916: variable 'interface0' from source: play vars 27844 1726882757.54922: variable 'interface0' from source: play vars 27844 1726882757.54962: variable 'interface0' from source: play vars 27844 1726882757.54979: variable 'interface1' from source: play vars 27844 1726882757.55019: variable 'interface1' from source: play vars 27844 1726882757.55030: variable 
'interface1' from source: play vars 27844 1726882757.55068: variable 'interface1' from source: play vars 27844 1726882757.55097: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882757.55104: when evaluation is False, skipping this task 27844 1726882757.55106: _execute() done 27844 1726882757.55109: dumping result to json 27844 1726882757.55112: done dumping result, returning 27844 1726882757.55114: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000026] 27844 1726882757.55118: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000026 27844 1726882757.55203: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000026 27844 1726882757.55210: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882757.55252: no more pending results, returning what we have 27844 1726882757.55255: results queue empty 27844 1726882757.55256: checking for any_errors_fatal 27844 1726882757.55265: done checking for any_errors_fatal 27844 1726882757.55266: checking for max_fail_percentage 27844 1726882757.55267: done checking for max_fail_percentage 27844 1726882757.55268: checking to see if all hosts have failed and the running result is not ok 27844 1726882757.55269: done checking to see if all hosts have failed 27844 1726882757.55270: getting the remaining hosts for this loop 27844 1726882757.55271: done getting the remaining hosts for this loop 27844 1726882757.55274: getting the next task for host managed_node1 27844 1726882757.55279: done getting next task for host managed_node1 27844 1726882757.55283: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27844 
1726882757.55286: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882757.55303: getting variables 27844 1726882757.55305: in VariableManager get_vars() 27844 1726882757.55343: Calling all_inventory to load vars for managed_node1 27844 1726882757.55346: Calling groups_inventory to load vars for managed_node1 27844 1726882757.55348: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882757.55356: Calling all_plugins_play to load vars for managed_node1 27844 1726882757.55358: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882757.55360: Calling groups_plugins_play to load vars for managed_node1 27844 1726882757.56224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882757.57133: done with get_vars() 27844 1726882757.57149: done getting variables 27844 1726882757.57189: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:17 -0400 (0:00:00.064) 0:00:16.648 
****** 27844 1726882757.57210: entering _queue_task() for managed_node1/service 27844 1726882757.57380: worker is 1 (out of 1 available) 27844 1726882757.57394: exiting _queue_task() for managed_node1/service 27844 1726882757.57406: done queuing things up, now waiting for results queue to drain 27844 1726882757.57407: waiting for pending results... 27844 1726882757.57570: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27844 1726882757.57654: in run() - task 0e448fcc-3ce9-efa9-466a-000000000027 27844 1726882757.57667: variable 'ansible_search_path' from source: unknown 27844 1726882757.57672: variable 'ansible_search_path' from source: unknown 27844 1726882757.57703: calling self._execute() 27844 1726882757.57772: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.57776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.57784: variable 'omit' from source: magic vars 27844 1726882757.58035: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.58045: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882757.58153: variable 'network_provider' from source: set_fact 27844 1726882757.58156: variable 'network_state' from source: role '' defaults 27844 1726882757.58166: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27844 1726882757.58175: variable 'omit' from source: magic vars 27844 1726882757.58209: variable 'omit' from source: magic vars 27844 1726882757.58228: variable 'network_service_name' from source: role '' defaults 27844 1726882757.58284: variable 'network_service_name' from source: role '' defaults 27844 1726882757.58353: variable '__network_provider_setup' from source: role '' defaults 27844 1726882757.58363: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882757.58410: variable 
'__network_service_name_default_nm' from source: role '' defaults 27844 1726882757.58417: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882757.58463: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882757.58613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882757.60089: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882757.60141: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882757.60170: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882757.60196: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882757.60218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882757.60276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.60298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.60317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.60343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.60355: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.60388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.60404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.60427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.60452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.60466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.60610: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27844 1726882757.60687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.60704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.60721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.60749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.60759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.60822: variable 'ansible_python' from source: facts 27844 1726882757.60840: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27844 1726882757.60896: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882757.60950: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882757.61032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.61048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.61071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.61098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.61108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.61141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882757.61164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882757.61183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.61208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882757.61218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882757.61310: variable 'network_connections' from source: task vars 27844 1726882757.61316: variable 'interface0' from source: play vars 27844 1726882757.61368: variable 'interface0' from source: play vars 27844 1726882757.61378: variable 'interface0' from source: play vars 27844 1726882757.61433: variable 'interface0' from source: play vars 27844 1726882757.61456: variable 'interface1' from source: play vars 27844 1726882757.61511: variable 'interface1' from source: play vars 27844 1726882757.61520: variable 'interface1' from source: play vars 27844 1726882757.61572: variable 'interface1' from source: play vars 27844 1726882757.61648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
27844 1726882757.61775: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882757.61812: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882757.61841: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882757.61871: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882757.61916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882757.61941: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882757.61968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882757.61992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882757.62031: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882757.62204: variable 'network_connections' from source: task vars 27844 1726882757.62207: variable 'interface0' from source: play vars 27844 1726882757.62259: variable 'interface0' from source: play vars 27844 1726882757.62273: variable 'interface0' from source: play vars 27844 1726882757.62324: variable 'interface0' from source: play vars 27844 1726882757.62344: variable 'interface1' from source: play vars 27844 1726882757.62401: variable 'interface1' from source: play vars 27844 1726882757.62409: variable 
'interface1' from source: play vars 27844 1726882757.62462: variable 'interface1' from source: play vars 27844 1726882757.62503: variable '__network_packages_default_wireless' from source: role '' defaults 27844 1726882757.62555: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882757.62742: variable 'network_connections' from source: task vars 27844 1726882757.62745: variable 'interface0' from source: play vars 27844 1726882757.62800: variable 'interface0' from source: play vars 27844 1726882757.62803: variable 'interface0' from source: play vars 27844 1726882757.62850: variable 'interface0' from source: play vars 27844 1726882757.62861: variable 'interface1' from source: play vars 27844 1726882757.62913: variable 'interface1' from source: play vars 27844 1726882757.62919: variable 'interface1' from source: play vars 27844 1726882757.62967: variable 'interface1' from source: play vars 27844 1726882757.62990: variable '__network_packages_default_team' from source: role '' defaults 27844 1726882757.63044: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882757.63231: variable 'network_connections' from source: task vars 27844 1726882757.63235: variable 'interface0' from source: play vars 27844 1726882757.63284: variable 'interface0' from source: play vars 27844 1726882757.63289: variable 'interface0' from source: play vars 27844 1726882757.63342: variable 'interface0' from source: play vars 27844 1726882757.63349: variable 'interface1' from source: play vars 27844 1726882757.63400: variable 'interface1' from source: play vars 27844 1726882757.63405: variable 'interface1' from source: play vars 27844 1726882757.63456: variable 'interface1' from source: play vars 27844 1726882757.63504: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882757.63548: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 
1726882757.63551: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882757.63597: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882757.63742: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27844 1726882757.64045: variable 'network_connections' from source: task vars 27844 1726882757.64048: variable 'interface0' from source: play vars 27844 1726882757.64100: variable 'interface0' from source: play vars 27844 1726882757.64103: variable 'interface0' from source: play vars 27844 1726882757.64143: variable 'interface0' from source: play vars 27844 1726882757.64152: variable 'interface1' from source: play vars 27844 1726882757.64197: variable 'interface1' from source: play vars 27844 1726882757.64201: variable 'interface1' from source: play vars 27844 1726882757.64243: variable 'interface1' from source: play vars 27844 1726882757.64252: variable 'ansible_distribution' from source: facts 27844 1726882757.64255: variable '__network_rh_distros' from source: role '' defaults 27844 1726882757.64260: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.64283: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27844 1726882757.64397: variable 'ansible_distribution' from source: facts 27844 1726882757.64400: variable '__network_rh_distros' from source: role '' defaults 27844 1726882757.64405: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.64416: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27844 1726882757.64529: variable 'ansible_distribution' from source: facts 27844 1726882757.64534: variable '__network_rh_distros' from source: role '' defaults 27844 1726882757.64537: variable 'ansible_distribution_major_version' from source: facts 27844 1726882757.64559: variable 'network_provider' from source: 
set_fact 27844 1726882757.64579: variable 'omit' from source: magic vars 27844 1726882757.64598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882757.64619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882757.64633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882757.64647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882757.64656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882757.64681: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882757.64684: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.64687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.64752: Set connection var ansible_shell_type to sh 27844 1726882757.64756: Set connection var ansible_connection to ssh 27844 1726882757.64758: Set connection var ansible_pipelining to False 27844 1726882757.64766: Set connection var ansible_timeout to 10 27844 1726882757.64773: Set connection var ansible_shell_executable to /bin/sh 27844 1726882757.64778: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882757.64797: variable 'ansible_shell_executable' from source: unknown 27844 1726882757.64800: variable 'ansible_connection' from source: unknown 27844 1726882757.64802: variable 'ansible_module_compression' from source: unknown 27844 1726882757.64804: variable 'ansible_shell_type' from source: unknown 27844 1726882757.64806: variable 'ansible_shell_executable' from source: unknown 27844 1726882757.64809: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882757.64812: 
variable 'ansible_pipelining' from source: unknown 27844 1726882757.64815: variable 'ansible_timeout' from source: unknown 27844 1726882757.64818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882757.64890: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882757.64898: variable 'omit' from source: magic vars 27844 1726882757.64902: starting attempt loop 27844 1726882757.64905: running the handler 27844 1726882757.64958: variable 'ansible_facts' from source: unknown 27844 1726882757.65432: _low_level_execute_command(): starting 27844 1726882757.65438: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882757.65943: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882757.65957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882757.65974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882757.65986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 
1726882757.65996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882757.66039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882757.66055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882757.66172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882757.67838: stdout chunk (state=3): >>>/root <<< 27844 1726882757.67943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882757.67997: stderr chunk (state=3): >>><<< 27844 1726882757.68000: stdout chunk (state=3): >>><<< 27844 1726882757.68017: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882757.68025: 
_low_level_execute_command(): starting 27844 1726882757.68030: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410 `" && echo ansible-tmp-1726882757.680156-28710-113002083346410="` echo /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410 `" ) && sleep 0' 27844 1726882757.68451: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882757.68463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882757.68485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882757.68501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882757.68548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882757.68560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882757.68658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882757.70515: stdout chunk (state=3): 
>>>ansible-tmp-1726882757.680156-28710-113002083346410=/root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410 <<< 27844 1726882757.70630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882757.70672: stderr chunk (state=3): >>><<< 27844 1726882757.70676: stdout chunk (state=3): >>><<< 27844 1726882757.70688: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882757.680156-28710-113002083346410=/root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882757.70711: variable 'ansible_module_compression' from source: unknown 27844 1726882757.70752: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 27844 1726882757.70758: ANSIBALLZ: Acquiring lock 27844 1726882757.70760: ANSIBALLZ: Lock acquired: 139916607833536 27844 1726882757.70773: ANSIBALLZ: Creating 
module 27844 1726882757.92352: ANSIBALLZ: Writing module into payload 27844 1726882757.92485: ANSIBALLZ: Writing module 27844 1726882757.92514: ANSIBALLZ: Renaming module 27844 1726882757.92518: ANSIBALLZ: Done creating module 27844 1726882757.92535: variable 'ansible_facts' from source: unknown 27844 1726882757.92633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410/AnsiballZ_systemd.py 27844 1726882757.92748: Sending initial data 27844 1726882757.92757: Sent initial data (155 bytes) 27844 1726882757.93581: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882757.93598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882757.93614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882757.93633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882757.93686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882757.93698: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882757.93713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882757.93730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882757.93741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882757.93751: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882757.93761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882757.93786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882757.93802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 
1726882757.93813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882757.93823: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882757.93835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882757.93920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882757.93937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882757.93952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882757.94131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882757.95979: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882757.96068: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882757.96161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpd_uyd058 /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410/AnsiballZ_systemd.py <<< 27844 1726882757.96248: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882757.98970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 
1726882757.99222: stderr chunk (state=3): >>><<< 27844 1726882757.99225: stdout chunk (state=3): >>><<< 27844 1726882757.99227: done transferring module to remote 27844 1726882757.99230: _low_level_execute_command(): starting 27844 1726882757.99232: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410/ /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410/AnsiballZ_systemd.py && sleep 0' 27844 1726882757.99814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882757.99828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882757.99843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882757.99862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882757.99912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882757.99926: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882757.99941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882757.99960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882757.99978: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882757.99991: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882758.00004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882758.00019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.00035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.00047: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882758.00058: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882758.00078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.00153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882758.00180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882758.00196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882758.00322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882758.02073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882758.02152: stderr chunk (state=3): >>><<< 27844 1726882758.02167: stdout chunk (state=3): >>><<< 27844 1726882758.02259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882758.02272: _low_level_execute_command(): starting 27844 1726882758.02276: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410/AnsiballZ_systemd.py && sleep 0' 27844 1726882758.02872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882758.02887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882758.02902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.02920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.02974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882758.02988: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882758.03002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.03020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882758.03032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882758.03052: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882758.03070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882758.03086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.03102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.03115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882758.03126: stderr chunk (state=3): >>>debug2: 
match found <<< 27844 1726882758.03139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.03225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882758.03250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882758.03278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882758.03417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882758.28169: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 27844 1726882758.28224: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "16232448", "MemoryAvailable": "infinity", "CPUUsageNSec": "1300703000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", 
"ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogS<<< 27844 1726882758.28248: stdout chunk (state=3): >>>ignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", 
"InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27844 1726882758.29646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882758.29702: stderr chunk (state=3): >>><<< 27844 1726882758.29705: stdout chunk (state=3): >>><<< 27844 1726882758.29718: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16232448", "MemoryAvailable": "infinity", "CPUUsageNSec": "1300703000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882758.29831: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882758.29837: _low_level_execute_command(): starting 27844 1726882758.29842: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882757.680156-28710-113002083346410/ > /dev/null 2>&1 && sleep 0' 27844 1726882758.30271: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882758.30278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.30283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.30314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.30326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.30383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882758.30386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882758.30394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882758.30498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882758.32283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882758.32323: stderr chunk (state=3): >>><<< 27844 1726882758.32326: stdout chunk (state=3): >>><<< 27844 1726882758.32337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882758.32343: handler run complete 27844 1726882758.32388: attempt loop complete, returning result 27844 1726882758.32391: _execute() done 27844 1726882758.32394: dumping result to json 27844 1726882758.32404: done dumping result, returning 27844 1726882758.32413: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-efa9-466a-000000000027] 27844 1726882758.32415: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000027 27844 1726882758.32647: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000027 27844 1726882758.32650: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882758.32703: no more pending results, returning what we have 27844 1726882758.32706: results queue empty 27844 1726882758.32707: checking for any_errors_fatal 27844 1726882758.32714: done checking for any_errors_fatal 27844 1726882758.32717: checking for max_fail_percentage 27844 1726882758.32719: done checking for max_fail_percentage 27844 1726882758.32720: checking to see if all hosts have failed and the running result is not ok 27844 1726882758.32721: done checking to see if all hosts have failed 27844 1726882758.32722: getting the remaining hosts for this loop 27844 1726882758.32726: done getting the remaining hosts for this loop 27844 1726882758.32730: getting the next task for host managed_node1 27844 1726882758.32736: done getting next task for host managed_node1 27844 1726882758.32740: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27844 1726882758.32743: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882758.32758: getting variables 27844 1726882758.32760: in VariableManager get_vars() 27844 1726882758.32820: Calling all_inventory to load vars for managed_node1 27844 1726882758.32823: Calling groups_inventory to load vars for managed_node1 27844 1726882758.32825: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882758.32836: Calling all_plugins_play to load vars for managed_node1 27844 1726882758.32838: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882758.32841: Calling groups_plugins_play to load vars for managed_node1 27844 1726882758.34283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882758.35300: done with get_vars() 27844 1726882758.35316: done getting variables 27844 1726882758.35359: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:18 -0400 (0:00:00.781) 0:00:17.430 ****** 27844 1726882758.35387: entering _queue_task() for managed_node1/service 27844 
1726882758.35593: worker is 1 (out of 1 available) 27844 1726882758.35607: exiting _queue_task() for managed_node1/service 27844 1726882758.35619: done queuing things up, now waiting for results queue to drain 27844 1726882758.35621: waiting for pending results... 27844 1726882758.35790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27844 1726882758.35900: in run() - task 0e448fcc-3ce9-efa9-466a-000000000028 27844 1726882758.35931: variable 'ansible_search_path' from source: unknown 27844 1726882758.35940: variable 'ansible_search_path' from source: unknown 27844 1726882758.36016: calling self._execute() 27844 1726882758.36481: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882758.36491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882758.36503: variable 'omit' from source: magic vars 27844 1726882758.36852: variable 'ansible_distribution_major_version' from source: facts 27844 1726882758.36874: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882758.36992: variable 'network_provider' from source: set_fact 27844 1726882758.37003: Evaluated conditional (network_provider == "nm"): True 27844 1726882758.37104: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882758.37196: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882758.37360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882758.39606: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882758.39674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882758.39713: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882758.39749: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882758.39784: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882758.39880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882758.39913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882758.39943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882758.39992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882758.40012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882758.40062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882758.40093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882758.40116: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882758.40152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882758.40173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882758.40211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882758.40234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882758.40257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882758.40308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882758.40328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882758.40475: variable 'network_connections' from source: task vars 27844 1726882758.40492: variable 'interface0' from source: play vars 27844 1726882758.40571: variable 'interface0' from source: play vars 27844 1726882758.40586: 
variable 'interface0' from source: play vars 27844 1726882758.40648: variable 'interface0' from source: play vars 27844 1726882758.40673: variable 'interface1' from source: play vars 27844 1726882758.40736: variable 'interface1' from source: play vars 27844 1726882758.40748: variable 'interface1' from source: play vars 27844 1726882758.40814: variable 'interface1' from source: play vars 27844 1726882758.40895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882758.41059: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882758.41106: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882758.41143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882758.41181: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882758.41226: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882758.41252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882758.41288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882758.41320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882758.41377: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 
1726882758.41631: variable 'network_connections' from source: task vars 27844 1726882758.41642: variable 'interface0' from source: play vars 27844 1726882758.41707: variable 'interface0' from source: play vars 27844 1726882758.41719: variable 'interface0' from source: play vars 27844 1726882758.41793: variable 'interface0' from source: play vars 27844 1726882758.41810: variable 'interface1' from source: play vars 27844 1726882758.41876: variable 'interface1' from source: play vars 27844 1726882758.41889: variable 'interface1' from source: play vars 27844 1726882758.41950: variable 'interface1' from source: play vars 27844 1726882758.42002: Evaluated conditional (__network_wpa_supplicant_required): False 27844 1726882758.42010: when evaluation is False, skipping this task 27844 1726882758.42017: _execute() done 27844 1726882758.42027: dumping result to json 27844 1726882758.42035: done dumping result, returning 27844 1726882758.42046: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-efa9-466a-000000000028] 27844 1726882758.42054: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27844 1726882758.42207: no more pending results, returning what we have 27844 1726882758.42211: results queue empty 27844 1726882758.42212: checking for any_errors_fatal 27844 1726882758.42235: done checking for any_errors_fatal 27844 1726882758.42236: checking for max_fail_percentage 27844 1726882758.42239: done checking for max_fail_percentage 27844 1726882758.42240: checking to see if all hosts have failed and the running result is not ok 27844 1726882758.42240: done checking to see if all hosts have failed 27844 1726882758.42241: getting the remaining hosts for this loop 27844 1726882758.42243: done getting the remaining hosts for this 
loop 27844 1726882758.42247: getting the next task for host managed_node1 27844 1726882758.42254: done getting next task for host managed_node1 27844 1726882758.42259: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27844 1726882758.42263: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882758.42282: getting variables 27844 1726882758.42284: in VariableManager get_vars() 27844 1726882758.42331: Calling all_inventory to load vars for managed_node1 27844 1726882758.42334: Calling groups_inventory to load vars for managed_node1 27844 1726882758.42337: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882758.42348: Calling all_plugins_play to load vars for managed_node1 27844 1726882758.42351: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882758.42354: Calling groups_plugins_play to load vars for managed_node1 27844 1726882758.43562: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000028 27844 1726882758.43570: WORKER PROCESS EXITING 27844 1726882758.44137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882758.46761: done with get_vars() 27844 1726882758.47190: done getting variables 27844 1726882758.47249: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:18 -0400 (0:00:00.118) 0:00:17.549 ****** 27844 1726882758.47289: entering _queue_task() for managed_node1/service 27844 1726882758.47876: worker is 1 (out of 1 available) 27844 1726882758.47888: exiting _queue_task() for managed_node1/service 27844 1726882758.47900: done queuing things up, now waiting for results queue to drain 27844 1726882758.47902: waiting for pending results... 27844 1726882758.48852: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 27844 1726882758.49082: in run() - task 0e448fcc-3ce9-efa9-466a-000000000029 27844 1726882758.49101: variable 'ansible_search_path' from source: unknown 27844 1726882758.49110: variable 'ansible_search_path' from source: unknown 27844 1726882758.49151: calling self._execute() 27844 1726882758.49251: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882758.49262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882758.49281: variable 'omit' from source: magic vars 27844 1726882758.49647: variable 'ansible_distribution_major_version' from source: facts 27844 1726882758.49674: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882758.49796: variable 'network_provider' from source: set_fact 27844 1726882758.49806: Evaluated conditional (network_provider == "initscripts"): False 27844 1726882758.49813: when evaluation is False, skipping this task 27844 1726882758.49819: _execute() done 27844 1726882758.49826: dumping result to json 27844 1726882758.49832: done dumping result, 
returning 27844 1726882758.49842: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-efa9-466a-000000000029] 27844 1726882758.49851: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000029 27844 1726882758.49958: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000029 27844 1726882758.49971: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882758.50030: no more pending results, returning what we have 27844 1726882758.50035: results queue empty 27844 1726882758.50036: checking for any_errors_fatal 27844 1726882758.50045: done checking for any_errors_fatal 27844 1726882758.50046: checking for max_fail_percentage 27844 1726882758.50048: done checking for max_fail_percentage 27844 1726882758.50049: checking to see if all hosts have failed and the running result is not ok 27844 1726882758.50050: done checking to see if all hosts have failed 27844 1726882758.50051: getting the remaining hosts for this loop 27844 1726882758.50052: done getting the remaining hosts for this loop 27844 1726882758.50055: getting the next task for host managed_node1 27844 1726882758.50067: done getting next task for host managed_node1 27844 1726882758.50072: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27844 1726882758.50075: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882758.50092: getting variables 27844 1726882758.50094: in VariableManager get_vars() 27844 1726882758.50137: Calling all_inventory to load vars for managed_node1 27844 1726882758.50139: Calling groups_inventory to load vars for managed_node1 27844 1726882758.50142: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882758.50154: Calling all_plugins_play to load vars for managed_node1 27844 1726882758.50157: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882758.50160: Calling groups_plugins_play to load vars for managed_node1 27844 1726882758.51899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882758.53573: done with get_vars() 27844 1726882758.53595: done getting variables 27844 1726882758.53655: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:18 -0400 (0:00:00.064) 0:00:17.613 ****** 27844 1726882758.53691: entering _queue_task() for managed_node1/copy 27844 1726882758.53932: worker is 1 (out of 1 available) 27844 1726882758.53944: exiting _queue_task() for managed_node1/copy 27844 1726882758.53954: done queuing things up, now waiting for results queue to drain 27844 1726882758.53955: waiting for pending results... 
27844 1726882758.54220: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27844 1726882758.54354: in run() - task 0e448fcc-3ce9-efa9-466a-00000000002a 27844 1726882758.54379: variable 'ansible_search_path' from source: unknown 27844 1726882758.54386: variable 'ansible_search_path' from source: unknown 27844 1726882758.54423: calling self._execute() 27844 1726882758.54520: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882758.54532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882758.54547: variable 'omit' from source: magic vars 27844 1726882758.54916: variable 'ansible_distribution_major_version' from source: facts 27844 1726882758.54938: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882758.55070: variable 'network_provider' from source: set_fact 27844 1726882758.55083: Evaluated conditional (network_provider == "initscripts"): False 27844 1726882758.55091: when evaluation is False, skipping this task 27844 1726882758.55097: _execute() done 27844 1726882758.55104: dumping result to json 27844 1726882758.55112: done dumping result, returning 27844 1726882758.55122: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-efa9-466a-00000000002a] 27844 1726882758.55133: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002a 27844 1726882758.55243: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002a 27844 1726882758.55251: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27844 1726882758.55314: no more pending results, returning what we have 27844 1726882758.55319: results queue empty 27844 1726882758.55320: checking for 
any_errors_fatal 27844 1726882758.55327: done checking for any_errors_fatal 27844 1726882758.55328: checking for max_fail_percentage 27844 1726882758.55330: done checking for max_fail_percentage 27844 1726882758.55331: checking to see if all hosts have failed and the running result is not ok 27844 1726882758.55332: done checking to see if all hosts have failed 27844 1726882758.55332: getting the remaining hosts for this loop 27844 1726882758.55334: done getting the remaining hosts for this loop 27844 1726882758.55338: getting the next task for host managed_node1 27844 1726882758.55345: done getting next task for host managed_node1 27844 1726882758.55349: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27844 1726882758.55353: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882758.55374: getting variables 27844 1726882758.55376: in VariableManager get_vars() 27844 1726882758.55419: Calling all_inventory to load vars for managed_node1 27844 1726882758.55422: Calling groups_inventory to load vars for managed_node1 27844 1726882758.55425: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882758.55438: Calling all_plugins_play to load vars for managed_node1 27844 1726882758.55441: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882758.55445: Calling groups_plugins_play to load vars for managed_node1 27844 1726882758.57045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882758.58748: done with get_vars() 27844 1726882758.58773: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:18 -0400 (0:00:00.051) 0:00:17.665 ****** 27844 1726882758.58854: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 27844 1726882758.58856: Creating lock for fedora.linux_system_roles.network_connections 27844 1726882758.59096: worker is 1 (out of 1 available) 27844 1726882758.59108: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 27844 1726882758.59120: done queuing things up, now waiting for results queue to drain 27844 1726882758.59121: waiting for pending results... 
27844 1726882758.59382: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27844 1726882758.59516: in run() - task 0e448fcc-3ce9-efa9-466a-00000000002b 27844 1726882758.59536: variable 'ansible_search_path' from source: unknown 27844 1726882758.59544: variable 'ansible_search_path' from source: unknown 27844 1726882758.59591: calling self._execute() 27844 1726882758.59685: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882758.59696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882758.59711: variable 'omit' from source: magic vars 27844 1726882758.60062: variable 'ansible_distribution_major_version' from source: facts 27844 1726882758.60084: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882758.60095: variable 'omit' from source: magic vars 27844 1726882758.60154: variable 'omit' from source: magic vars 27844 1726882758.60314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882758.62849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882758.62916: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882758.62959: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882758.63001: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882758.63030: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882758.63111: variable 'network_provider' from source: set_fact 27844 1726882758.63239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882758.63282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882758.63313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882758.63360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882758.63387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882758.63460: variable 'omit' from source: magic vars 27844 1726882758.63587: variable 'omit' from source: magic vars 27844 1726882758.63697: variable 'network_connections' from source: task vars 27844 1726882758.63712: variable 'interface0' from source: play vars 27844 1726882758.63784: variable 'interface0' from source: play vars 27844 1726882758.63799: variable 'interface0' from source: play vars 27844 1726882758.63859: variable 'interface0' from source: play vars 27844 1726882758.63883: variable 'interface1' from source: play vars 27844 1726882758.63946: variable 'interface1' from source: play vars 27844 1726882758.63957: variable 'interface1' from source: play vars 27844 1726882758.64028: variable 'interface1' from source: play vars 27844 1726882758.64269: variable 'omit' from source: magic vars 27844 1726882758.64284: variable '__lsr_ansible_managed' from source: task vars 27844 1726882758.64349: variable '__lsr_ansible_managed' from source: task vars 
27844 1726882758.64593: Loaded config def from plugin (lookup/template) 27844 1726882758.64596: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27844 1726882758.64616: File lookup term: get_ansible_managed.j2 27844 1726882758.64619: variable 'ansible_search_path' from source: unknown 27844 1726882758.64622: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27844 1726882758.64632: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27844 1726882758.64645: variable 'ansible_search_path' from source: unknown 27844 1726882758.68580: variable 'ansible_managed' from source: unknown 27844 1726882758.68654: variable 'omit' from source: magic vars 27844 1726882758.68678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882758.68697: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882758.68711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882758.68724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882758.68733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882758.68756: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882758.68759: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882758.68762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882758.68824: Set connection var ansible_shell_type to sh 27844 1726882758.68828: Set connection var ansible_connection to ssh 27844 1726882758.68835: Set connection var ansible_pipelining to False 27844 1726882758.68845: Set connection var ansible_timeout to 10 27844 1726882758.68850: Set connection var ansible_shell_executable to /bin/sh 27844 1726882758.68855: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882758.68877: variable 'ansible_shell_executable' from source: unknown 27844 1726882758.68880: variable 'ansible_connection' from source: unknown 27844 1726882758.68883: variable 'ansible_module_compression' from source: unknown 27844 1726882758.68885: variable 'ansible_shell_type' from source: unknown 27844 1726882758.68888: variable 'ansible_shell_executable' from source: unknown 27844 1726882758.68890: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882758.68892: variable 'ansible_pipelining' from source: unknown 27844 1726882758.68894: variable 'ansible_timeout' from source: unknown 27844 1726882758.68903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 
1726882758.68986: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882758.68994: variable 'omit' from source: magic vars 27844 1726882758.69001: starting attempt loop 27844 1726882758.69004: running the handler 27844 1726882758.69015: _low_level_execute_command(): starting 27844 1726882758.69021: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882758.69509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.69540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882758.69544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882758.69553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882758.69558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.69574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882758.69583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.69648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 
1726882758.69674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882758.69691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882758.69832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882758.71484: stdout chunk (state=3): >>>/root <<< 27844 1726882758.71582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882758.71636: stderr chunk (state=3): >>><<< 27844 1726882758.71638: stdout chunk (state=3): >>><<< 27844 1726882758.71665: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882758.71671: _low_level_execute_command(): starting 27844 1726882758.71674: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967 `" && echo ansible-tmp-1726882758.7165067-28739-57318209595967="` echo /root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967 `" ) && sleep 0' 27844 1726882758.72347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882758.72355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882758.72364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.72383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.72407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882758.72413: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882758.72421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.72430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882758.72438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882758.72444: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.72456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.72461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.72514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882758.72529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 
1726882758.72539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882758.72648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882758.74496: stdout chunk (state=3): >>>ansible-tmp-1726882758.7165067-28739-57318209595967=/root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967 <<< 27844 1726882758.74641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882758.74705: stderr chunk (state=3): >>><<< 27844 1726882758.74709: stdout chunk (state=3): >>><<< 27844 1726882758.74726: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882758.7165067-28739-57318209595967=/root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882758.74775: variable 'ansible_module_compression' 
from source: unknown 27844 1726882758.74825: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 27844 1726882758.74828: ANSIBALLZ: Acquiring lock 27844 1726882758.74830: ANSIBALLZ: Lock acquired: 139916600952352 27844 1726882758.74833: ANSIBALLZ: Creating module 27844 1726882758.96134: ANSIBALLZ: Writing module into payload 27844 1726882758.96596: ANSIBALLZ: Writing module 27844 1726882758.96625: ANSIBALLZ: Renaming module 27844 1726882758.96635: ANSIBALLZ: Done creating module 27844 1726882758.96659: variable 'ansible_facts' from source: unknown 27844 1726882758.96752: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967/AnsiballZ_network_connections.py 27844 1726882758.96919: Sending initial data 27844 1726882758.96922: Sent initial data (167 bytes) 27844 1726882758.97914: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882758.97929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882758.97944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.97965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.98011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882758.98024: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882758.98039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.98057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882758.98083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882758.98095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882758.98108: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 27844 1726882758.98123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882758.98138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882758.98151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882758.98162: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882758.98181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882758.98254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882758.98283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882758.98301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882758.98433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882759.00280: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882759.00360: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882759.00458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp58rmnkon 
/root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967/AnsiballZ_network_connections.py <<< 27844 1726882759.00554: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882759.02372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882759.02676: stderr chunk (state=3): >>><<< 27844 1726882759.02679: stdout chunk (state=3): >>><<< 27844 1726882759.02681: done transferring module to remote 27844 1726882759.02683: _low_level_execute_command(): starting 27844 1726882759.02685: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967/ /root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967/AnsiballZ_network_connections.py && sleep 0' 27844 1726882759.03420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882759.03429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882759.03439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.03452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.03494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.03500: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882759.03510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.03525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882759.03530: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882759.03537: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882759.03545: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 27844 1726882759.03554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.03569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.03574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.03581: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882759.03590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.03662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882759.03682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882759.03694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882759.03810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882759.05624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882759.05627: stdout chunk (state=3): >>><<< 27844 1726882759.05634: stderr chunk (state=3): >>><<< 27844 1726882759.05661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882759.05673: _low_level_execute_command(): starting 27844 1726882759.05682: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967/AnsiballZ_network_connections.py && sleep 0' 27844 1726882759.06532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882759.06778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.06812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882759.06833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882759.06847: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 27844 1726882759.06985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882759.37597: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": 
"ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27844 1726882759.40075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882759.40079: stdout chunk (state=3): >>><<< 27844 1726882759.40090: stderr chunk (state=3): >>><<< 27844 1726882759.40276: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882759.40281: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.3/24', '2001:db8::2/32'], 'route': [{'network': '198.51.10.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4}, {'network': '2001:db6::4', 'prefix': 128, 'gateway': '2001:db8::1', 'metric': 2}]}}, {'name': 'ethtest1', 'interface_name': 'ethtest1', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.6/24', '2001:db8::4/32'], 'route': [{'network': '198.51.12.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882759.40288: _low_level_execute_command(): starting 27844 1726882759.40291: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882758.7165067-28739-57318209595967/ > /dev/null 2>&1 && sleep 0' 27844 1726882759.42090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882759.42992: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882759.43007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.43022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.43063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.43182: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882759.43196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.43213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882759.43223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882759.43232: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882759.43242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882759.43253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.43271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.43283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.43292: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882759.43305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.43384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882759.43792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882759.43809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882759.43940: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882759.45871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882759.45876: stdout chunk (state=3): >>><<< 27844 1726882759.45878: stderr chunk (state=3): >>><<< 27844 1726882759.46074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882759.46077: handler run complete 27844 1726882759.46080: attempt loop complete, returning result 27844 1726882759.46082: _execute() done 27844 1726882759.46084: dumping result to json 27844 1726882759.46086: done dumping result, returning 27844 1726882759.46088: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-efa9-466a-00000000002b] 27844 1726882759.46090: sending task result for task 
0e448fcc-3ce9-efa9-466a-00000000002b 27844 1726882759.46185: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002b 27844 1726882759.46188: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47 [006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47 (not-active) [008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 (not-active) 27844 1726882759.46371: no more pending results, returning what we have 27844 1726882759.46375: results queue empty 27844 1726882759.46376: checking for any_errors_fatal 27844 1726882759.46383: done checking for any_errors_fatal 27844 1726882759.46384: checking for max_fail_percentage 27844 1726882759.46386: done checking for max_fail_percentage 27844 1726882759.46387: checking to see if all hosts have failed and the 
running result is not ok 27844 1726882759.46388: done checking to see if all hosts have failed 27844 1726882759.46389: getting the remaining hosts for this loop 27844 1726882759.46390: done getting the remaining hosts for this loop 27844 1726882759.46394: getting the next task for host managed_node1 27844 1726882759.46401: done getting next task for host managed_node1 27844 1726882759.46405: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27844 1726882759.46412: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882759.46423: getting variables 27844 1726882759.46425: in VariableManager get_vars() 27844 1726882759.46873: Calling all_inventory to load vars for managed_node1 27844 1726882759.46876: Calling groups_inventory to load vars for managed_node1 27844 1726882759.46879: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882759.46888: Calling all_plugins_play to load vars for managed_node1 27844 1726882759.46891: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882759.46894: Calling groups_plugins_play to load vars for managed_node1 27844 1726882759.49699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882759.53500: done with get_vars() 27844 1726882759.53531: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:19 -0400 (0:00:00.948) 0:00:18.614 ****** 27844 1726882759.53742: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 27844 1726882759.53744: Creating lock for fedora.linux_system_roles.network_state 27844 1726882759.54498: worker is 1 (out of 1 available) 27844 1726882759.54512: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 27844 1726882759.54525: done queuing things up, now waiting for results queue to drain 27844 1726882759.54641: waiting for pending results... 
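[Annotation] The `module_args` captured in the "Configure networking connection profiles" result above correspond to role variables along these lines. This is a reconstruction from the logged invocation, not taken from the actual playbook source; the `network_connections` variable name follows the documented interface of the `fedora.linux_system_roles.network` role:

```yaml
# Reconstructed from the logged module_args (hypothetical playbook excerpt)
network_connections:
  - name: ethtest0
    interface_name: ethtest0
    type: ethernet
    state: up
    autoconnect: false
    ip:
      address:
        - 198.51.100.3/24
        - 2001:db8::2/32
      route:
        - network: 198.51.10.64
          prefix: 26
          gateway: 198.51.100.6
          metric: 4
        - network: 2001:db6::4
          prefix: 128
          gateway: 2001:db8::1
          metric: 2
  - name: ethtest1
    interface_name: ethtest1
    type: ethernet
    state: up
    autoconnect: false
    ip:
      address:
        - 198.51.100.6/24
        - 2001:db8::4/32
      route:
        - network: 198.51.12.128
          prefix: 26
          gateway: 198.51.100.1
          metric: 2
```

The `(not-active)` markers in the module's STDERR indicate the profiles were added and activation was requested but had not yet completed when the module reported; with `autoconnect: false`, activation happens only through this explicit `state: up`.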
27844 1726882759.55479: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 27844 1726882759.55775: in run() - task 0e448fcc-3ce9-efa9-466a-00000000002c 27844 1726882759.55871: variable 'ansible_search_path' from source: unknown 27844 1726882759.55881: variable 'ansible_search_path' from source: unknown 27844 1726882759.55926: calling self._execute() 27844 1726882759.56156: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.56179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.56295: variable 'omit' from source: magic vars 27844 1726882759.56991: variable 'ansible_distribution_major_version' from source: facts 27844 1726882759.57008: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882759.57142: variable 'network_state' from source: role '' defaults 27844 1726882759.57273: Evaluated conditional (network_state != {}): False 27844 1726882759.57376: when evaluation is False, skipping this task 27844 1726882759.57384: _execute() done 27844 1726882759.57391: dumping result to json 27844 1726882759.57399: done dumping result, returning 27844 1726882759.57410: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-efa9-466a-00000000002c] 27844 1726882759.57420: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882759.57581: no more pending results, returning what we have 27844 1726882759.57586: results queue empty 27844 1726882759.57587: checking for any_errors_fatal 27844 1726882759.57604: done checking for any_errors_fatal 27844 1726882759.57605: checking for max_fail_percentage 27844 1726882759.57607: done checking for max_fail_percentage 27844 1726882759.57608: 
checking to see if all hosts have failed and the running result is not ok 27844 1726882759.57609: done checking to see if all hosts have failed 27844 1726882759.57609: getting the remaining hosts for this loop 27844 1726882759.57611: done getting the remaining hosts for this loop 27844 1726882759.57615: getting the next task for host managed_node1 27844 1726882759.57621: done getting next task for host managed_node1 27844 1726882759.57625: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27844 1726882759.57630: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882759.57646: getting variables 27844 1726882759.57649: in VariableManager get_vars() 27844 1726882759.57698: Calling all_inventory to load vars for managed_node1 27844 1726882759.57701: Calling groups_inventory to load vars for managed_node1 27844 1726882759.57704: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882759.57717: Calling all_plugins_play to load vars for managed_node1 27844 1726882759.57720: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882759.57724: Calling groups_plugins_play to load vars for managed_node1 27844 1726882759.58711: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002c 27844 1726882759.58714: WORKER PROCESS EXITING 27844 1726882759.59585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882759.61535: done with get_vars() 27844 1726882759.61556: done getting variables 27844 1726882759.61621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:19 -0400 (0:00:00.079) 0:00:18.693 ****** 27844 1726882759.61655: entering _queue_task() for managed_node1/debug 27844 1726882759.61948: worker is 1 (out of 1 available) 27844 1726882759.61960: exiting _queue_task() for managed_node1/debug 27844 1726882759.61974: done queuing things up, now waiting for results queue to drain 27844 1726882759.61976: waiting for pending results... 
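[Annotation] The "Configure networking state" task above was skipped because `network_state` was still the role default of `{}` (logged as `Evaluated conditional (network_state != {}): False`). That task drives the Nmstate-based provider path; a minimal, illustrative value that would make the conditional true looks like the following (hypothetical — interface names and schema shown here follow the Nmstate state format, not this run's playbook):

```yaml
# Hypothetical vars to exercise the network_state task (Nmstate schema)
network_state:
  interfaces:
    - name: ethtest0
      type: ethernet
      state: up
```

Since this run used `provider: nm` with `network_connections` only, skipping the `network_state` task is the expected behavior.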
27844 1726882759.62279: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27844 1726882759.62430: in run() - task 0e448fcc-3ce9-efa9-466a-00000000002d 27844 1726882759.62456: variable 'ansible_search_path' from source: unknown 27844 1726882759.62475: variable 'ansible_search_path' from source: unknown 27844 1726882759.62517: calling self._execute() 27844 1726882759.62624: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.62642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.62657: variable 'omit' from source: magic vars 27844 1726882759.63046: variable 'ansible_distribution_major_version' from source: facts 27844 1726882759.63068: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882759.63083: variable 'omit' from source: magic vars 27844 1726882759.63144: variable 'omit' from source: magic vars 27844 1726882759.63190: variable 'omit' from source: magic vars 27844 1726882759.63248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882759.63296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882759.63325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882759.63347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882759.63368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882759.63404: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882759.63413: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.63425: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 27844 1726882759.63570: Set connection var ansible_shell_type to sh 27844 1726882759.63580: Set connection var ansible_connection to ssh 27844 1726882759.63592: Set connection var ansible_pipelining to False 27844 1726882759.63602: Set connection var ansible_timeout to 10 27844 1726882759.63615: Set connection var ansible_shell_executable to /bin/sh 27844 1726882759.63627: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882759.63662: variable 'ansible_shell_executable' from source: unknown 27844 1726882759.63679: variable 'ansible_connection' from source: unknown 27844 1726882759.63688: variable 'ansible_module_compression' from source: unknown 27844 1726882759.63695: variable 'ansible_shell_type' from source: unknown 27844 1726882759.63701: variable 'ansible_shell_executable' from source: unknown 27844 1726882759.63708: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.63715: variable 'ansible_pipelining' from source: unknown 27844 1726882759.63721: variable 'ansible_timeout' from source: unknown 27844 1726882759.63732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.63886: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882759.63904: variable 'omit' from source: magic vars 27844 1726882759.63913: starting attempt loop 27844 1726882759.63919: running the handler 27844 1726882759.64057: variable '__network_connections_result' from source: set_fact 27844 1726882759.64128: handler run complete 27844 1726882759.64152: attempt loop complete, returning result 27844 1726882759.64163: _execute() done 27844 1726882759.64174: dumping result to json 27844 1726882759.64186: 
done dumping result, returning 27844 1726882759.64198: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-efa9-466a-00000000002d] 27844 1726882759.64206: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002d ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 (not-active)" ] } 27844 1726882759.64372: no more pending results, returning what we have 27844 1726882759.64376: results queue empty 27844 1726882759.64377: checking for any_errors_fatal 27844 1726882759.64385: done checking for any_errors_fatal 27844 1726882759.64386: checking for max_fail_percentage 27844 1726882759.64387: done checking for max_fail_percentage 27844 1726882759.64388: checking to see if all hosts have failed and the running result is not ok 27844 1726882759.64389: done checking to see if all hosts have failed 27844 1726882759.64390: getting the remaining hosts for this loop 27844 1726882759.64392: done getting the remaining hosts for this loop 27844 1726882759.64396: getting the next task for host managed_node1 27844 1726882759.64402: done getting next task for host managed_node1 27844 1726882759.64406: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27844 1726882759.64410: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882759.64421: getting variables 27844 1726882759.64423: in VariableManager get_vars() 27844 1726882759.64467: Calling all_inventory to load vars for managed_node1 27844 1726882759.64470: Calling groups_inventory to load vars for managed_node1 27844 1726882759.64473: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882759.64485: Calling all_plugins_play to load vars for managed_node1 27844 1726882759.64488: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882759.64491: Calling groups_plugins_play to load vars for managed_node1 27844 1726882759.65504: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002d 27844 1726882759.65508: WORKER PROCESS EXITING 27844 1726882759.66238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882759.68029: done with get_vars() 27844 1726882759.68050: done getting variables 27844 1726882759.68109: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:19 -0400 (0:00:00.064) 0:00:18.758 ****** 27844 
1726882759.68151: entering _queue_task() for managed_node1/debug 27844 1726882759.68412: worker is 1 (out of 1 available) 27844 1726882759.68423: exiting _queue_task() for managed_node1/debug 27844 1726882759.68435: done queuing things up, now waiting for results queue to drain 27844 1726882759.68436: waiting for pending results... 27844 1726882759.68714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27844 1726882759.68847: in run() - task 0e448fcc-3ce9-efa9-466a-00000000002e 27844 1726882759.68870: variable 'ansible_search_path' from source: unknown 27844 1726882759.68882: variable 'ansible_search_path' from source: unknown 27844 1726882759.68925: calling self._execute() 27844 1726882759.69025: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.69036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.69049: variable 'omit' from source: magic vars 27844 1726882759.69430: variable 'ansible_distribution_major_version' from source: facts 27844 1726882759.69450: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882759.69460: variable 'omit' from source: magic vars 27844 1726882759.69524: variable 'omit' from source: magic vars 27844 1726882759.69571: variable 'omit' from source: magic vars 27844 1726882759.69612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882759.69655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882759.69684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882759.69705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882759.69720: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882759.69754: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882759.69772: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.69781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.69890: Set connection var ansible_shell_type to sh 27844 1726882759.69898: Set connection var ansible_connection to ssh 27844 1726882759.69907: Set connection var ansible_pipelining to False 27844 1726882759.69916: Set connection var ansible_timeout to 10 27844 1726882759.69924: Set connection var ansible_shell_executable to /bin/sh 27844 1726882759.69933: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882759.69962: variable 'ansible_shell_executable' from source: unknown 27844 1726882759.69979: variable 'ansible_connection' from source: unknown 27844 1726882759.69986: variable 'ansible_module_compression' from source: unknown 27844 1726882759.69992: variable 'ansible_shell_type' from source: unknown 27844 1726882759.69998: variable 'ansible_shell_executable' from source: unknown 27844 1726882759.70004: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.70010: variable 'ansible_pipelining' from source: unknown 27844 1726882759.70016: variable 'ansible_timeout' from source: unknown 27844 1726882759.70023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.70158: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882759.70179: variable 'omit' from source: magic vars 27844 1726882759.70195: starting attempt 
loop 27844 1726882759.70202: running the handler 27844 1726882759.70253: variable '__network_connections_result' from source: set_fact 27844 1726882759.70342: variable '__network_connections_result' from source: set_fact 27844 1726882759.70569: handler run complete 27844 1726882759.70618: attempt loop complete, returning result 27844 1726882759.70630: _execute() done 27844 1726882759.70636: dumping result to json 27844 1726882759.70645: done dumping result, returning 27844 1726882759.70656: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-efa9-466a-00000000002e] 27844 1726882759.70669: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002e ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828\n[007] #0, state:up persistent_state:present, 
'ethtest0': up connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, abbd2acd-64d2-4926-8931-5a572400bc47 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 (not-active)" ] } } 27844 1726882759.70907: no more pending results, returning what we have 27844 1726882759.70912: results queue empty 27844 1726882759.70913: checking for any_errors_fatal 27844 1726882759.70922: done checking for any_errors_fatal 27844 1726882759.70923: checking for max_fail_percentage 27844 1726882759.70925: done checking for max_fail_percentage 27844 1726882759.70926: checking to see if all hosts have failed and the running result is not ok 27844 1726882759.70927: done checking to see if all hosts have failed 27844 1726882759.70927: getting the remaining hosts for this loop 27844 1726882759.70929: done getting the remaining hosts for this loop 27844 1726882759.70932: getting the next task for host managed_node1 27844 1726882759.70938: done getting next task for host managed_node1 27844 1726882759.70941: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27844 1726882759.70944: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882759.70955: getting variables 27844 1726882759.70957: in VariableManager get_vars() 27844 1726882759.71000: Calling all_inventory to load vars for managed_node1 27844 1726882759.71003: Calling groups_inventory to load vars for managed_node1 27844 1726882759.71006: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882759.71016: Calling all_plugins_play to load vars for managed_node1 27844 1726882759.71019: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882759.71022: Calling groups_plugins_play to load vars for managed_node1 27844 1726882759.72003: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002e 27844 1726882759.72007: WORKER PROCESS EXITING 27844 1726882759.72839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882759.75694: done with get_vars() 27844 1726882759.75833: done getting variables 27844 1726882759.75901: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:19 -0400 (0:00:00.078) 0:00:18.837 ****** 27844 1726882759.76050: entering _queue_task() for 
managed_node1/debug 27844 1726882759.76727: worker is 1 (out of 1 available) 27844 1726882759.76741: exiting _queue_task() for managed_node1/debug 27844 1726882759.76754: done queuing things up, now waiting for results queue to drain 27844 1726882759.76755: waiting for pending results... 27844 1726882759.78332: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27844 1726882759.78896: in run() - task 0e448fcc-3ce9-efa9-466a-00000000002f 27844 1726882759.78918: variable 'ansible_search_path' from source: unknown 27844 1726882759.78926: variable 'ansible_search_path' from source: unknown 27844 1726882759.78972: calling self._execute() 27844 1726882759.79069: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.79085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.79101: variable 'omit' from source: magic vars 27844 1726882759.79719: variable 'ansible_distribution_major_version' from source: facts 27844 1726882759.79997: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882759.80147: variable 'network_state' from source: role '' defaults 27844 1726882759.80182: Evaluated conditional (network_state != {}): False 27844 1726882759.80191: when evaluation is False, skipping this task 27844 1726882759.80198: _execute() done 27844 1726882759.80205: dumping result to json 27844 1726882759.80212: done dumping result, returning 27844 1726882759.80222: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-efa9-466a-00000000002f] 27844 1726882759.80230: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002f skipping: [managed_node1] => { "false_condition": "network_state != {}" } 27844 1726882759.80376: no more pending results, returning what we have 27844 1726882759.80381: results queue empty 27844 
1726882759.80382: checking for any_errors_fatal 27844 1726882759.80402: done checking for any_errors_fatal 27844 1726882759.80403: checking for max_fail_percentage 27844 1726882759.80405: done checking for max_fail_percentage 27844 1726882759.80406: checking to see if all hosts have failed and the running result is not ok 27844 1726882759.80408: done checking to see if all hosts have failed 27844 1726882759.80408: getting the remaining hosts for this loop 27844 1726882759.80410: done getting the remaining hosts for this loop 27844 1726882759.80414: getting the next task for host managed_node1 27844 1726882759.80420: done getting next task for host managed_node1 27844 1726882759.80425: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27844 1726882759.80428: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882759.80444: getting variables 27844 1726882759.80446: in VariableManager get_vars() 27844 1726882759.80489: Calling all_inventory to load vars for managed_node1 27844 1726882759.80492: Calling groups_inventory to load vars for managed_node1 27844 1726882759.80495: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882759.80508: Calling all_plugins_play to load vars for managed_node1 27844 1726882759.80511: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882759.80513: Calling groups_plugins_play to load vars for managed_node1 27844 1726882759.81582: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000002f 27844 1726882759.81585: WORKER PROCESS EXITING 27844 1726882759.82208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882759.84771: done with get_vars() 27844 1726882759.84797: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:19 -0400 (0:00:00.088) 0:00:18.925 ****** 27844 1726882759.84897: entering _queue_task() for managed_node1/ping 27844 1726882759.84899: Creating lock for ping 27844 1726882759.85192: worker is 1 (out of 1 available) 27844 1726882759.85205: exiting _queue_task() for managed_node1/ping 27844 1726882759.85217: done queuing things up, now waiting for results queue to drain 27844 1726882759.85219: waiting for pending results... 
27844 1726882759.85494: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 27844 1726882759.85632: in run() - task 0e448fcc-3ce9-efa9-466a-000000000030 27844 1726882759.85653: variable 'ansible_search_path' from source: unknown 27844 1726882759.85663: variable 'ansible_search_path' from source: unknown 27844 1726882759.85705: calling self._execute() 27844 1726882759.85806: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.85817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.85829: variable 'omit' from source: magic vars 27844 1726882759.86305: variable 'ansible_distribution_major_version' from source: facts 27844 1726882759.86322: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882759.86332: variable 'omit' from source: magic vars 27844 1726882759.86389: variable 'omit' from source: magic vars 27844 1726882759.86429: variable 'omit' from source: magic vars 27844 1726882759.86479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882759.86517: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882759.86540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882759.86564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882759.86586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882759.86616: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882759.86625: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.86632: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 27844 1726882759.86734: Set connection var ansible_shell_type to sh 27844 1726882759.86741: Set connection var ansible_connection to ssh 27844 1726882759.86751: Set connection var ansible_pipelining to False 27844 1726882759.86760: Set connection var ansible_timeout to 10 27844 1726882759.86773: Set connection var ansible_shell_executable to /bin/sh 27844 1726882759.86782: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882759.86816: variable 'ansible_shell_executable' from source: unknown 27844 1726882759.86823: variable 'ansible_connection' from source: unknown 27844 1726882759.86830: variable 'ansible_module_compression' from source: unknown 27844 1726882759.86836: variable 'ansible_shell_type' from source: unknown 27844 1726882759.86841: variable 'ansible_shell_executable' from source: unknown 27844 1726882759.86847: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882759.86854: variable 'ansible_pipelining' from source: unknown 27844 1726882759.86860: variable 'ansible_timeout' from source: unknown 27844 1726882759.86871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882759.87063: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882759.87125: variable 'omit' from source: magic vars 27844 1726882759.87134: starting attempt loop 27844 1726882759.87141: running the handler 27844 1726882759.87156: _low_level_execute_command(): starting 27844 1726882759.87177: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882759.88746: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882759.88758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 
1726882759.88774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.88794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.88827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.88834: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882759.88845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.88856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882759.88866: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882759.88875: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882759.88882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882759.88895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.88906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.88912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.88919: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882759.88928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.88997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882759.89017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882759.89029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882759.89155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 
1726882759.91484: stdout chunk (state=3): >>>/root <<< 27844 1726882759.91561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882759.91567: stdout chunk (state=3): >>><<< 27844 1726882759.91788: stderr chunk (state=3): >>><<< 27844 1726882759.91807: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882759.91821: _low_level_execute_command(): starting 27844 1726882759.91826: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106 `" && echo ansible-tmp-1726882759.918079-28782-268077087991106="` echo /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106 `" ) && sleep 0' 27844 1726882759.93020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 27844 1726882759.93023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882759.93026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.93029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.93031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.93033: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882759.93035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.93041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882759.93049: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882759.93055: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882759.93070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882759.93078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882759.93090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882759.93098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882759.93113: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882759.93123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882759.93205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882759.93212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882759.93220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 27844 1726882759.93346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882759.95234: stdout chunk (state=3): >>>ansible-tmp-1726882759.918079-28782-268077087991106=/root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106 <<< 27844 1726882759.95344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882759.95410: stderr chunk (state=3): >>><<< 27844 1726882759.95413: stdout chunk (state=3): >>><<< 27844 1726882759.95669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882759.918079-28782-268077087991106=/root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882759.95673: variable 'ansible_module_compression' from source: unknown 27844 1726882759.95675: ANSIBALLZ: Using lock for ping 27844 1726882759.95677: 
ANSIBALLZ: Acquiring lock 27844 1726882759.95679: ANSIBALLZ: Lock acquired: 139916600953840 27844 1726882759.95681: ANSIBALLZ: Creating module 27844 1726882760.08680: ANSIBALLZ: Writing module into payload 27844 1726882760.08752: ANSIBALLZ: Writing module 27844 1726882760.08789: ANSIBALLZ: Renaming module 27844 1726882760.08800: ANSIBALLZ: Done creating module 27844 1726882760.08819: variable 'ansible_facts' from source: unknown 27844 1726882760.08899: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106/AnsiballZ_ping.py 27844 1726882760.09059: Sending initial data 27844 1726882760.09062: Sent initial data (152 bytes) 27844 1726882760.10120: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882760.10135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.10150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.10171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.10222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882760.10234: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882760.10248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.10271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882760.10289: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882760.10305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882760.10319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.10333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 
1726882760.10350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.10362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882760.10377: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882760.10394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.10475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.10499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.10523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.10668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.12465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882760.12549: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882760.12639: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmplh0e7ojn /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106/AnsiballZ_ping.py <<< 27844 1726882760.12732: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory 
<<< 27844 1726882760.13724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.13969: stderr chunk (state=3): >>><<< 27844 1726882760.13973: stdout chunk (state=3): >>><<< 27844 1726882760.13975: done transferring module to remote 27844 1726882760.13977: _low_level_execute_command(): starting 27844 1726882760.13980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106/ /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106/AnsiballZ_ping.py && sleep 0' 27844 1726882760.14514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.14518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.14549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882760.14552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.14556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.14647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.14655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 27844 1726882760.14754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.16460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.16511: stderr chunk (state=3): >>><<< 27844 1726882760.16518: stdout chunk (state=3): >>><<< 27844 1726882760.16585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882760.16588: _low_level_execute_command(): starting 27844 1726882760.16590: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106/AnsiballZ_ping.py && sleep 0' 27844 1726882760.17101: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882760.17114: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 27844 1726882760.17132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.17149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.17194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882760.17206: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882760.17220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.17242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882760.17254: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882760.17270: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882760.17284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.17298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.17314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.17326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882760.17341: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882760.17355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.17434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.17453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.17472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.17601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 27844 1726882760.30489: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27844 1726882760.31475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882760.31479: stdout chunk (state=3): >>><<< 27844 1726882760.31486: stderr chunk (state=3): >>><<< 27844 1726882760.31503: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882760.31528: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882760.31535: _low_level_execute_command(): starting 27844 1726882760.31541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882759.918079-28782-268077087991106/ > /dev/null 2>&1 && sleep 0' 27844 1726882760.33278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.33282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.33456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.33460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882760.33477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.33480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.33494: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882760.33499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.33581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.33661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.33767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.35626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.35630: stderr chunk (state=3): >>><<< 27844 1726882760.35639: stdout chunk (state=3): >>><<< 27844 1726882760.35653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882760.35659: handler run complete 27844 1726882760.35679: attempt loop complete, returning result 27844 1726882760.35682: _execute() done 27844 1726882760.35685: dumping result to json 27844 1726882760.35687: done dumping result, returning 27844 1726882760.35696: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-efa9-466a-000000000030] 27844 1726882760.35701: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000030 27844 1726882760.35798: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000030 27844 1726882760.35800: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 27844 1726882760.35862: no more pending results, returning what we have 27844 1726882760.35869: results queue empty 27844 1726882760.35870: checking for any_errors_fatal 27844 1726882760.35878: done checking for any_errors_fatal 27844 1726882760.35878: checking for max_fail_percentage 27844 1726882760.35880: done checking for max_fail_percentage 27844 1726882760.35881: checking to see if all hosts have failed and the running result is not ok 27844 1726882760.35881: done checking to see if all hosts have failed 27844 1726882760.35882: getting the remaining hosts for this loop 27844 1726882760.35883: done getting the remaining hosts for this loop 27844 1726882760.35887: getting the next task for host managed_node1 27844 1726882760.35896: done getting next task for host managed_node1 27844 1726882760.35899: ^ task is: TASK: meta (role_complete) 27844 1726882760.35902: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882760.35912: getting variables 27844 1726882760.35915: in VariableManager get_vars() 27844 1726882760.35959: Calling all_inventory to load vars for managed_node1 27844 1726882760.35961: Calling groups_inventory to load vars for managed_node1 27844 1726882760.35968: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882760.35978: Calling all_plugins_play to load vars for managed_node1 27844 1726882760.35980: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882760.35983: Calling groups_plugins_play to load vars for managed_node1 27844 1726882760.38970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882760.41191: done with get_vars() 27844 1726882760.41218: done getting variables 27844 1726882760.41329: done queuing things up, now waiting for results queue to drain 27844 1726882760.41331: results queue empty 27844 1726882760.41332: checking for any_errors_fatal 27844 1726882760.41335: done checking for any_errors_fatal 27844 1726882760.41335: checking for max_fail_percentage 27844 1726882760.41336: done checking for max_fail_percentage 27844 1726882760.41337: checking to see if all hosts have failed and the running result is not ok 27844 1726882760.41338: done checking to see if all hosts have failed 27844 1726882760.41338: getting the remaining hosts for this loop 27844 1726882760.41339: done getting the remaining hosts for this loop 27844 1726882760.41342: getting the next task for host managed_node1 27844 1726882760.41345: done getting next task for host 
managed_node1 27844 1726882760.41347: ^ task is: TASK: Get the IPv4 routes from the route table main 27844 1726882760.41349: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882760.41351: getting variables 27844 1726882760.41352: in VariableManager get_vars() 27844 1726882760.41369: Calling all_inventory to load vars for managed_node1 27844 1726882760.41372: Calling groups_inventory to load vars for managed_node1 27844 1726882760.41374: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882760.41378: Calling all_plugins_play to load vars for managed_node1 27844 1726882760.41380: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882760.41386: Calling groups_plugins_play to load vars for managed_node1 27844 1726882760.42823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882760.49343: done with get_vars() 27844 1726882760.49392: done getting variables 27844 1726882760.49433: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the IPv4 routes from the route table main] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:73 Friday 20 September 2024 21:39:20 -0400 (0:00:00.645) 0:00:19.571 ****** 27844 1726882760.49456: entering _queue_task() for managed_node1/command 27844 1726882760.49839: worker is 1 (out of 1 available) 27844 1726882760.49853: 
exiting _queue_task() for managed_node1/command 27844 1726882760.49866: done queuing things up, now waiting for results queue to drain 27844 1726882760.49868: waiting for pending results... 27844 1726882760.50481: running TaskExecutor() for managed_node1/TASK: Get the IPv4 routes from the route table main 27844 1726882760.50487: in run() - task 0e448fcc-3ce9-efa9-466a-000000000060 27844 1726882760.50490: variable 'ansible_search_path' from source: unknown 27844 1726882760.50493: calling self._execute() 27844 1726882760.50496: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.50501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.50504: variable 'omit' from source: magic vars 27844 1726882760.50842: variable 'ansible_distribution_major_version' from source: facts 27844 1726882760.50862: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882760.50871: variable 'omit' from source: magic vars 27844 1726882760.50890: variable 'omit' from source: magic vars 27844 1726882760.50925: variable 'omit' from source: magic vars 27844 1726882760.50976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882760.51014: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882760.51029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882760.51043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882760.51057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882760.51083: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882760.51086: variable 'ansible_host' from source: host vars for 
'managed_node1' 27844 1726882760.51088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.51200: Set connection var ansible_shell_type to sh 27844 1726882760.51204: Set connection var ansible_connection to ssh 27844 1726882760.51206: Set connection var ansible_pipelining to False 27844 1726882760.51214: Set connection var ansible_timeout to 10 27844 1726882760.51219: Set connection var ansible_shell_executable to /bin/sh 27844 1726882760.51228: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882760.51278: variable 'ansible_shell_executable' from source: unknown 27844 1726882760.51282: variable 'ansible_connection' from source: unknown 27844 1726882760.51284: variable 'ansible_module_compression' from source: unknown 27844 1726882760.51287: variable 'ansible_shell_type' from source: unknown 27844 1726882760.51289: variable 'ansible_shell_executable' from source: unknown 27844 1726882760.51292: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.51295: variable 'ansible_pipelining' from source: unknown 27844 1726882760.51297: variable 'ansible_timeout' from source: unknown 27844 1726882760.51299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.51489: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882760.51503: variable 'omit' from source: magic vars 27844 1726882760.51506: starting attempt loop 27844 1726882760.51509: running the handler 27844 1726882760.51523: _low_level_execute_command(): starting 27844 1726882760.51538: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882760.52359: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.52393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882760.52397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.52404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.52547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.52551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.52554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.52958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.54291: stdout chunk (state=3): >>>/root <<< 27844 1726882760.54397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.54437: stderr chunk (state=3): >>><<< 27844 1726882760.54440: stdout chunk (state=3): >>><<< 27844 1726882760.54462: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882760.54474: _low_level_execute_command(): starting 27844 1726882760.54479: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801 `" && echo ansible-tmp-1726882760.5445914-28806-43150895722801="` echo /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801 `" ) && sleep 0' 27844 1726882760.55044: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882760.55048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.55073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.55109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.55123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.55140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.55144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.55234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.55257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.55388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.57252: stdout chunk (state=3): >>>ansible-tmp-1726882760.5445914-28806-43150895722801=/root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801 <<< 27844 1726882760.57375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.57456: stderr chunk (state=3): >>><<< 27844 1726882760.57459: stdout chunk (state=3): >>><<< 27844 1726882760.57477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882760.5445914-28806-43150895722801=/root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882760.57512: variable 'ansible_module_compression' from source: unknown 27844 1726882760.57570: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882760.57611: variable 'ansible_facts' from source: unknown 27844 1726882760.57688: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801/AnsiballZ_command.py 27844 1726882760.58000: Sending initial data 27844 1726882760.58007: Sent initial data (155 bytes) 27844 1726882760.58753: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882760.58762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.58771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.58792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.58815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882760.58822: stderr chunk (state=3): >>>debug2: match not found <<< 
27844 1726882760.58830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.58840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882760.58847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.58855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.58861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.58917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.58938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.58941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.59041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.60770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882760.60856: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using 
server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882760.60949: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpwnd8g7fh /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801/AnsiballZ_command.py <<< 27844 1726882760.61039: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882760.62047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.62156: stderr chunk (state=3): >>><<< 27844 1726882760.62159: stdout chunk (state=3): >>><<< 27844 1726882760.62183: done transferring module to remote 27844 1726882760.62192: _low_level_execute_command(): starting 27844 1726882760.62197: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801/ /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801/AnsiballZ_command.py && sleep 0' 27844 1726882760.62657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.62671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.62705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.62710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882760.62719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.62725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 
1726882760.62734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.62740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.62802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.62806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.62814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.62920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.64642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.64697: stderr chunk (state=3): >>><<< 27844 1726882760.64701: stdout chunk (state=3): >>><<< 27844 1726882760.64720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882760.64724: _low_level_execute_command(): starting 27844 1726882760.64727: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801/AnsiballZ_command.py && sleep 0' 27844 1726882760.65177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.65183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.65213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.65226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.65283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.65295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.65397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.79075: stdout chunk (state=3): >>> {"changed": true, 
"stdout": "default via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-20 21:39:20.785697", "end": "2024-09-20 21:39:20.789114", "delta": "0:00:00.003417", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882760.80217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882760.80278: stderr chunk (state=3): >>><<< 27844 1726882760.80282: stdout chunk (state=3): >>><<< 27844 1726882760.80301: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-20 21:39:20.785697", "end": "2024-09-20 21:39:20.789114", "delta": "0:00:00.003417", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882760.80333: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882760.80339: _low_level_execute_command(): starting 27844 1726882760.80343: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882760.5445914-28806-43150895722801/ > /dev/null 2>&1 && sleep 0' 27844 1726882760.80807: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.80815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.80846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.80858: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.80872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.80923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.80932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.81035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.82828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.82876: stderr chunk (state=3): >>><<< 27844 1726882760.82879: stdout chunk (state=3): >>><<< 27844 1726882760.82893: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882760.82899: handler run complete 27844 1726882760.82917: Evaluated conditional (False): False 27844 1726882760.82927: attempt loop complete, returning result 27844 1726882760.82930: _execute() done 27844 1726882760.82932: dumping result to json 27844 1726882760.82937: done dumping result, returning 27844 1726882760.82944: done running TaskExecutor() for managed_node1/TASK: Get the IPv4 routes from the route table main [0e448fcc-3ce9-efa9-466a-000000000060] 27844 1726882760.82948: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000060 27844 1726882760.83048: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000060 27844 1726882760.83051: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "-4", "route" ], "delta": "0:00:00.003417", "end": "2024-09-20 21:39:20.789114", "rc": 0, "start": "2024-09-20 21:39:20.785697" } STDOUT: default via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 27844 1726882760.83127: no more pending results, returning what we have 27844 1726882760.83131: results queue empty 27844 1726882760.83131: checking for any_errors_fatal 27844 1726882760.83133: done checking for any_errors_fatal 27844 1726882760.83134: checking for max_fail_percentage 27844 1726882760.83136: done checking for 
max_fail_percentage 27844 1726882760.83136: checking to see if all hosts have failed and the running result is not ok 27844 1726882760.83137: done checking to see if all hosts have failed 27844 1726882760.83138: getting the remaining hosts for this loop 27844 1726882760.83139: done getting the remaining hosts for this loop 27844 1726882760.83143: getting the next task for host managed_node1 27844 1726882760.83148: done getting next task for host managed_node1 27844 1726882760.83151: ^ task is: TASK: Assert that the route table main contains the specified IPv4 routes 27844 1726882760.83152: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882760.83157: getting variables 27844 1726882760.83158: in VariableManager get_vars() 27844 1726882760.83200: Calling all_inventory to load vars for managed_node1 27844 1726882760.83203: Calling groups_inventory to load vars for managed_node1 27844 1726882760.83205: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882760.83216: Calling all_plugins_play to load vars for managed_node1 27844 1726882760.83219: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882760.83221: Calling groups_plugins_play to load vars for managed_node1 27844 1726882760.84407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882760.85677: done with get_vars() 27844 1726882760.85693: done getting variables 27844 1726882760.85737: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv4 routes] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:78 Friday 20 September 2024 21:39:20 -0400 (0:00:00.363) 0:00:19.934 ****** 27844 1726882760.85760: entering _queue_task() for managed_node1/assert 27844 1726882760.85982: worker is 1 (out of 1 available) 27844 1726882760.85996: exiting _queue_task() for managed_node1/assert 27844 1726882760.86009: done queuing things up, now waiting for results queue to drain 27844 1726882760.86011: waiting for pending results... 27844 1726882760.86237: running TaskExecutor() for managed_node1/TASK: Assert that the route table main contains the specified IPv4 routes 27844 1726882760.86313: in run() - task 0e448fcc-3ce9-efa9-466a-000000000061 27844 1726882760.86324: variable 'ansible_search_path' from source: unknown 27844 1726882760.86359: calling self._execute() 27844 1726882760.86439: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.86443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.86451: variable 'omit' from source: magic vars 27844 1726882760.86732: variable 'ansible_distribution_major_version' from source: facts 27844 1726882760.86742: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882760.86747: variable 'omit' from source: magic vars 27844 1726882760.86768: variable 'omit' from source: magic vars 27844 1726882760.86795: variable 'omit' from source: magic vars 27844 1726882760.86827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882760.86853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882760.86874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
27844 1726882760.87176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882760.87179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882760.87181: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882760.87184: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.87187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.87190: Set connection var ansible_shell_type to sh 27844 1726882760.87192: Set connection var ansible_connection to ssh 27844 1726882760.87195: Set connection var ansible_pipelining to False 27844 1726882760.87197: Set connection var ansible_timeout to 10 27844 1726882760.87199: Set connection var ansible_shell_executable to /bin/sh 27844 1726882760.87202: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882760.87204: variable 'ansible_shell_executable' from source: unknown 27844 1726882760.87206: variable 'ansible_connection' from source: unknown 27844 1726882760.87209: variable 'ansible_module_compression' from source: unknown 27844 1726882760.87212: variable 'ansible_shell_type' from source: unknown 27844 1726882760.87214: variable 'ansible_shell_executable' from source: unknown 27844 1726882760.87217: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.87219: variable 'ansible_pipelining' from source: unknown 27844 1726882760.87221: variable 'ansible_timeout' from source: unknown 27844 1726882760.87223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.87372: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882760.87377: variable 'omit' from source: magic vars 27844 1726882760.87379: starting attempt loop 27844 1726882760.87381: running the handler 27844 1726882760.87682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882760.87851: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882760.87891: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882760.87940: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882760.87974: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882760.88089: variable 'route_table_main_ipv4' from source: set_fact 27844 1726882760.88092: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")): True 27844 1726882760.88224: variable 'route_table_main_ipv4' from source: set_fact 27844 1726882760.88254: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")): True 27844 1726882760.88258: handler run complete 27844 1726882760.88299: attempt loop complete, returning result 27844 1726882760.88303: _execute() done 27844 1726882760.88306: dumping result to json 27844 1726882760.88308: done dumping result, returning 27844 1726882760.88310: done running TaskExecutor() for managed_node1/TASK: Assert that the route table main contains the specified IPv4 routes [0e448fcc-3ce9-efa9-466a-000000000061] 27844 1726882760.88313: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000061 27844 
1726882760.88380: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000061 27844 1726882760.88382: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882760.88597: no more pending results, returning what we have 27844 1726882760.88600: results queue empty 27844 1726882760.88601: checking for any_errors_fatal 27844 1726882760.88608: done checking for any_errors_fatal 27844 1726882760.88608: checking for max_fail_percentage 27844 1726882760.88610: done checking for max_fail_percentage 27844 1726882760.88610: checking to see if all hosts have failed and the running result is not ok 27844 1726882760.88611: done checking to see if all hosts have failed 27844 1726882760.88612: getting the remaining hosts for this loop 27844 1726882760.88613: done getting the remaining hosts for this loop 27844 1726882760.88616: getting the next task for host managed_node1 27844 1726882760.88621: done getting next task for host managed_node1 27844 1726882760.88623: ^ task is: TASK: Get the IPv6 routes from the route table main 27844 1726882760.88625: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882760.88628: getting variables 27844 1726882760.88630: in VariableManager get_vars() 27844 1726882760.88755: Calling all_inventory to load vars for managed_node1 27844 1726882760.88758: Calling groups_inventory to load vars for managed_node1 27844 1726882760.88761: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882760.88772: Calling all_plugins_play to load vars for managed_node1 27844 1726882760.88775: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882760.88778: Calling groups_plugins_play to load vars for managed_node1 27844 1726882760.90748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882760.93349: done with get_vars() 27844 1726882760.93377: done getting variables 27844 1726882760.93438: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the IPv6 routes from the route table main] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:89 Friday 20 September 2024 21:39:20 -0400 (0:00:00.077) 0:00:20.011 ****** 27844 1726882760.93491: entering _queue_task() for managed_node1/command 27844 1726882760.93833: worker is 1 (out of 1 available) 27844 1726882760.93846: exiting _queue_task() for managed_node1/command 27844 1726882760.93859: done queuing things up, now waiting for results queue to drain 27844 1726882760.93861: waiting for pending results... 
27844 1726882760.94157: running TaskExecutor() for managed_node1/TASK: Get the IPv6 routes from the route table main 27844 1726882760.94245: in run() - task 0e448fcc-3ce9-efa9-466a-000000000062 27844 1726882760.94258: variable 'ansible_search_path' from source: unknown 27844 1726882760.94299: calling self._execute() 27844 1726882760.94408: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.94417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.94459: variable 'omit' from source: magic vars 27844 1726882760.95673: variable 'ansible_distribution_major_version' from source: facts 27844 1726882760.95686: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882760.95691: variable 'omit' from source: magic vars 27844 1726882760.95713: variable 'omit' from source: magic vars 27844 1726882760.95750: variable 'omit' from source: magic vars 27844 1726882760.95793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882760.95828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882760.95850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882760.95872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882760.95884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882760.95914: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882760.95919: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.95923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.96021: Set connection var ansible_shell_type to sh 27844 
1726882760.96025: Set connection var ansible_connection to ssh 27844 1726882760.96028: Set connection var ansible_pipelining to False 27844 1726882760.96035: Set connection var ansible_timeout to 10 27844 1726882760.96040: Set connection var ansible_shell_executable to /bin/sh 27844 1726882760.96044: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882760.96077: variable 'ansible_shell_executable' from source: unknown 27844 1726882760.96081: variable 'ansible_connection' from source: unknown 27844 1726882760.96083: variable 'ansible_module_compression' from source: unknown 27844 1726882760.96086: variable 'ansible_shell_type' from source: unknown 27844 1726882760.96088: variable 'ansible_shell_executable' from source: unknown 27844 1726882760.96091: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882760.96093: variable 'ansible_pipelining' from source: unknown 27844 1726882760.96097: variable 'ansible_timeout' from source: unknown 27844 1726882760.96102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882760.96228: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882760.96238: variable 'omit' from source: magic vars 27844 1726882760.96243: starting attempt loop 27844 1726882760.96246: running the handler 27844 1726882760.96260: _low_level_execute_command(): starting 27844 1726882760.96272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882760.96982: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882760.96993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.97003: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.97020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.97068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882760.97072: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882760.97084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.97098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882760.97106: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882760.97113: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882760.97120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882760.97131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882760.97148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882760.97154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882760.97161: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882760.97173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882760.97245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882760.97264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882760.97270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882760.97590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882760.99171: stdout chunk (state=3): >>>/root <<< 
27844 1726882760.99334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882760.99337: stdout chunk (state=3): >>><<< 27844 1726882760.99347: stderr chunk (state=3): >>><<< 27844 1726882760.99371: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882760.99382: _low_level_execute_command(): starting 27844 1726882760.99387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812 `" && echo ansible-tmp-1726882760.993685-28829-164192120981812="` echo /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812 `" ) && sleep 0' 27844 1726882761.01329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.01337: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.01349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.01371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.01407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.01413: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.01423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.01436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.01443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.01449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.01457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.01476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.01487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.01495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.01502: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.01510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.01589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.01603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.01613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.01734: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.03590: stdout chunk (state=3): >>>ansible-tmp-1726882760.993685-28829-164192120981812=/root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812 <<< 27844 1726882761.03771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.03774: stdout chunk (state=3): >>><<< 27844 1726882761.03779: stderr chunk (state=3): >>><<< 27844 1726882761.03803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882760.993685-28829-164192120981812=/root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882761.03835: variable 'ansible_module_compression' from source: unknown 27844 1726882761.03895: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882761.03933: variable 'ansible_facts' from source: unknown 27844 1726882761.04012: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812/AnsiballZ_command.py 27844 1726882761.04415: Sending initial data 27844 1726882761.04419: Sent initial data (155 bytes) 27844 1726882761.06962: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.07025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.07043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.07058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.07100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.07107: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.07132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.07237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.07243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.07252: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.07260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.07276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.07286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.07293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 
1726882761.07300: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.07308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.07389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.07461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.07473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.07607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.09382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882761.09467: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882761.09554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpqi5hrfyg /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812/AnsiballZ_command.py <<< 27844 1726882761.09644: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882761.11337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.11342: stderr chunk (state=3): >>><<< 27844 1726882761.11345: stdout chunk (state=3): >>><<< 27844 1726882761.11372: done transferring 
module to remote 27844 1726882761.11385: _low_level_execute_command(): starting 27844 1726882761.11390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812/ /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812/AnsiballZ_command.py && sleep 0' 27844 1726882761.12956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.13100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.13111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.13126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.13169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.13173: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.13184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.13201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.13207: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.13216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.13225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.13238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.13250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.13257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.13268: stderr chunk (state=3): >>>debug2: match found <<< 27844 
1726882761.13276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.13346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.13424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.13428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.13848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.15548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.15552: stdout chunk (state=3): >>><<< 27844 1726882761.15559: stderr chunk (state=3): >>><<< 27844 1726882761.15580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882761.15583: 
_low_level_execute_command(): starting 27844 1726882761.15589: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812/AnsiballZ_command.py && sleep 0' 27844 1726882761.17136: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.17143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.17187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.17194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.17208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.17214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.17494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.17497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.17509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.17630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.31045: stdout chunk (state=3): >>> {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref 
medium\n2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:39:21.305482", "end": "2024-09-20 21:39:21.308854", "delta": "0:00:00.003372", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882761.32174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882761.32232: stderr chunk (state=3): >>><<< 27844 1726882761.32235: stdout chunk (state=3): >>><<< 27844 1726882761.32256: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:39:21.305482", "end": "2024-09-20 21:39:21.308854", "delta": "0:00:00.003372", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882761.32304: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882761.32311: _low_level_execute_command(): starting 27844 1726882761.32317: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882760.993685-28829-164192120981812/ > /dev/null 2>&1 && sleep 0' 27844 1726882761.33042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.33051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.33062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.33078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.33122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 <<< 27844 1726882761.33132: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.33142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.33155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.33163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.33171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.33180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.33189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.33207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.33214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.33220: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.33230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.33298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.33318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.33325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.33446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.35327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.35330: stdout chunk (state=3): >>><<< 27844 1726882761.35332: stderr chunk (state=3): >>><<< 27844 1726882761.35534: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882761.35537: handler run complete 27844 1726882761.35539: Evaluated conditional (False): False 27844 1726882761.35542: attempt loop complete, returning result 27844 1726882761.35544: _execute() done 27844 1726882761.35546: dumping result to json 27844 1726882761.35548: done dumping result, returning 27844 1726882761.35550: done running TaskExecutor() for managed_node1/TASK: Get the IPv6 routes from the route table main [0e448fcc-3ce9-efa9-466a-000000000062] 27844 1726882761.35552: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000062 27844 1726882761.35653: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000062 27844 1726882761.35657: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003372", "end": "2024-09-20 21:39:21.308854", "rc": 0, "start": "2024-09-20 21:39:21.305482" } STDOUT: ::1 dev lo proto kernel metric 256 pref 
medium 2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium 2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium 2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium 27844 1726882761.35741: no more pending results, returning what we have 27844 1726882761.35744: results queue empty 27844 1726882761.35745: checking for any_errors_fatal 27844 1726882761.35754: done checking for any_errors_fatal 27844 1726882761.35755: checking for max_fail_percentage 27844 1726882761.35756: done checking for max_fail_percentage 27844 1726882761.35758: checking to see if all hosts have failed and the running result is not ok 27844 1726882761.35758: done checking to see if all hosts have failed 27844 1726882761.35759: getting the remaining hosts for this loop 27844 1726882761.35761: done getting the remaining hosts for this loop 27844 1726882761.35773: getting the next task for host managed_node1 27844 1726882761.35781: done getting next task for host managed_node1 27844 1726882761.35784: ^ task is: TASK: Assert that the route table main contains the specified IPv6 routes 27844 1726882761.35786: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882761.35790: getting variables 27844 1726882761.35791: in VariableManager get_vars() 27844 1726882761.35829: Calling all_inventory to load vars for managed_node1 27844 1726882761.35832: Calling groups_inventory to load vars for managed_node1 27844 1726882761.35834: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882761.35845: Calling all_plugins_play to load vars for managed_node1 27844 1726882761.35848: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882761.35850: Calling groups_plugins_play to load vars for managed_node1 27844 1726882761.37318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882761.42354: done with get_vars() 27844 1726882761.42379: done getting variables 27844 1726882761.42559: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv6 routes] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:94 Friday 20 September 2024 21:39:21 -0400 (0:00:00.490) 0:00:20.502 ****** 27844 1726882761.42592: entering _queue_task() for managed_node1/assert 27844 1726882761.43243: worker is 1 (out of 1 available) 27844 1726882761.43256: exiting _queue_task() for managed_node1/assert 27844 1726882761.43302: done queuing things up, now waiting for results queue to drain 27844 1726882761.43304: waiting for pending results... 
27844 1726882761.43949: running TaskExecutor() for managed_node1/TASK: Assert that the route table main contains the specified IPv6 routes 27844 1726882761.44652: in run() - task 0e448fcc-3ce9-efa9-466a-000000000063 27844 1726882761.44678: variable 'ansible_search_path' from source: unknown 27844 1726882761.44721: calling self._execute() 27844 1726882761.44819: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882761.44831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882761.44846: variable 'omit' from source: magic vars 27844 1726882761.45217: variable 'ansible_distribution_major_version' from source: facts 27844 1726882761.45888: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882761.45902: variable 'omit' from source: magic vars 27844 1726882761.45931: variable 'omit' from source: magic vars 27844 1726882761.45980: variable 'omit' from source: magic vars 27844 1726882761.46028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882761.46075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882761.46101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882761.46123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882761.46140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882761.46179: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882761.46189: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882761.46197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882761.46308: Set connection var 
ansible_shell_type to sh 27844 1726882761.46317: Set connection var ansible_connection to ssh 27844 1726882761.46330: Set connection var ansible_pipelining to False 27844 1726882761.46341: Set connection var ansible_timeout to 10 27844 1726882761.46355: Set connection var ansible_shell_executable to /bin/sh 27844 1726882761.46371: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882761.46407: variable 'ansible_shell_executable' from source: unknown 27844 1726882761.46415: variable 'ansible_connection' from source: unknown 27844 1726882761.46423: variable 'ansible_module_compression' from source: unknown 27844 1726882761.46430: variable 'ansible_shell_type' from source: unknown 27844 1726882761.46436: variable 'ansible_shell_executable' from source: unknown 27844 1726882761.47078: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882761.47088: variable 'ansible_pipelining' from source: unknown 27844 1726882761.47095: variable 'ansible_timeout' from source: unknown 27844 1726882761.47102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882761.47233: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882761.47248: variable 'omit' from source: magic vars 27844 1726882761.47257: starting attempt loop 27844 1726882761.47263: running the handler 27844 1726882761.47432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882761.47674: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882761.47717: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 
1726882761.47779: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882761.48007: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882761.48102: variable 'route_table_main_ipv6' from source: set_fact 27844 1726882761.48141: Evaluated conditional (route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")): True 27844 1726882761.48778: handler run complete 27844 1726882761.48797: attempt loop complete, returning result 27844 1726882761.48804: _execute() done 27844 1726882761.48810: dumping result to json 27844 1726882761.48816: done dumping result, returning 27844 1726882761.48829: done running TaskExecutor() for managed_node1/TASK: Assert that the route table main contains the specified IPv6 routes [0e448fcc-3ce9-efa9-466a-000000000063] 27844 1726882761.48837: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000063 27844 1726882761.48938: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000063 27844 1726882761.48945: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882761.49007: no more pending results, returning what we have 27844 1726882761.49011: results queue empty 27844 1726882761.49012: checking for any_errors_fatal 27844 1726882761.49025: done checking for any_errors_fatal 27844 1726882761.49025: checking for max_fail_percentage 27844 1726882761.49027: done checking for max_fail_percentage 27844 1726882761.49028: checking to see if all hosts have failed and the running result is not ok 27844 1726882761.49028: done checking to see if all hosts have failed 27844 1726882761.49029: getting the remaining hosts for this loop 27844 1726882761.49030: done getting the remaining hosts for this loop 27844 1726882761.49034: getting the next task for host managed_node1 27844 1726882761.49040: done getting next task 
for host managed_node1 27844 1726882761.49042: ^ task is: TASK: Get the interface1 MAC address 27844 1726882761.49044: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882761.49048: getting variables 27844 1726882761.49050: in VariableManager get_vars() 27844 1726882761.49095: Calling all_inventory to load vars for managed_node1 27844 1726882761.49098: Calling groups_inventory to load vars for managed_node1 27844 1726882761.49100: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882761.49112: Calling all_plugins_play to load vars for managed_node1 27844 1726882761.49115: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882761.49117: Calling groups_plugins_play to load vars for managed_node1 27844 1726882761.51533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882761.53385: done with get_vars() 27844 1726882761.53418: done getting variables 27844 1726882761.53483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the interface1 MAC address] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99 Friday 20 September 2024 21:39:21 -0400 (0:00:00.109) 0:00:20.611 ****** 27844 1726882761.53516: entering _queue_task() for managed_node1/command 27844 1726882761.53841: worker is 1 (out of 1 available) 27844 1726882761.53858: exiting 
_queue_task() for managed_node1/command 27844 1726882761.53872: done queuing things up, now waiting for results queue to drain 27844 1726882761.53874: waiting for pending results... 27844 1726882761.54180: running TaskExecutor() for managed_node1/TASK: Get the interface1 MAC address 27844 1726882761.54268: in run() - task 0e448fcc-3ce9-efa9-466a-000000000064 27844 1726882761.54286: variable 'ansible_search_path' from source: unknown 27844 1726882761.54328: calling self._execute() 27844 1726882761.54449: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882761.54453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882761.54465: variable 'omit' from source: magic vars 27844 1726882761.54889: variable 'ansible_distribution_major_version' from source: facts 27844 1726882761.54901: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882761.54907: variable 'omit' from source: magic vars 27844 1726882761.54938: variable 'omit' from source: magic vars 27844 1726882761.55050: variable 'interface1' from source: play vars 27844 1726882761.55070: variable 'omit' from source: magic vars 27844 1726882761.55112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882761.55155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882761.55177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882761.55199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882761.55212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882761.55240: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882761.55249: variable 
'ansible_host' from source: host vars for 'managed_node1' 27844 1726882761.55252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882761.55370: Set connection var ansible_shell_type to sh 27844 1726882761.55373: Set connection var ansible_connection to ssh 27844 1726882761.55379: Set connection var ansible_pipelining to False 27844 1726882761.55385: Set connection var ansible_timeout to 10 27844 1726882761.55391: Set connection var ansible_shell_executable to /bin/sh 27844 1726882761.55397: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882761.55428: variable 'ansible_shell_executable' from source: unknown 27844 1726882761.55431: variable 'ansible_connection' from source: unknown 27844 1726882761.55434: variable 'ansible_module_compression' from source: unknown 27844 1726882761.55436: variable 'ansible_shell_type' from source: unknown 27844 1726882761.55438: variable 'ansible_shell_executable' from source: unknown 27844 1726882761.55441: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882761.55446: variable 'ansible_pipelining' from source: unknown 27844 1726882761.55448: variable 'ansible_timeout' from source: unknown 27844 1726882761.55452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882761.55605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882761.55617: variable 'omit' from source: magic vars 27844 1726882761.55627: starting attempt loop 27844 1726882761.55630: running the handler 27844 1726882761.55672: _low_level_execute_command(): starting 27844 1726882761.55683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882761.56494: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.56511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.56522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.56537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.56584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.56591: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.56601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.56618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.56626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.56633: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.56641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.56650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.56662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.56676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.56692: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.56702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.56781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.56806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.56819: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.56955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.58614: stdout chunk (state=3): >>>/root <<< 27844 1726882761.58777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.58781: stdout chunk (state=3): >>><<< 27844 1726882761.58790: stderr chunk (state=3): >>><<< 27844 1726882761.58819: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882761.58830: _low_level_execute_command(): starting 27844 1726882761.58836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710 `" && echo 
ansible-tmp-1726882761.5881827-28847-186543019457710="` echo /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710 `" ) && sleep 0' 27844 1726882761.60932: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.60940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.60951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.60969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.61003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.61010: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.61030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.61044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.61139: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.61147: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.61155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.61169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.61178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.61186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.61192: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.61202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.61284: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 27844 1726882761.61362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.61377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.61503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.63402: stdout chunk (state=3): >>>ansible-tmp-1726882761.5881827-28847-186543019457710=/root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710 <<< 27844 1726882761.63571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.63574: stdout chunk (state=3): >>><<< 27844 1726882761.63594: stderr chunk (state=3): >>><<< 27844 1726882761.63610: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882761.5881827-28847-186543019457710=/root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882761.63640: variable 'ansible_module_compression' from source: unknown 27844 1726882761.63698: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882761.63735: variable 'ansible_facts' from source: unknown 27844 1726882761.63809: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710/AnsiballZ_command.py 27844 1726882761.63948: Sending initial data 27844 1726882761.63951: Sent initial data (156 bytes) 27844 1726882761.65103: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.65113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.65123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.65135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.65183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.65189: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.65199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.65213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.65220: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.65227: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.65235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.65244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.65268: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.65279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.65286: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.65296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.65379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.65396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.65408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.65524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.67262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882761.67353: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882761.67449: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp_h1b56j3 /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710/AnsiballZ_command.py <<< 27844 1726882761.67539: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882761.68976: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.69057: stderr chunk (state=3): >>><<< 27844 1726882761.69061: stdout chunk (state=3): >>><<< 27844 1726882761.69068: done transferring module to remote 27844 1726882761.69083: _low_level_execute_command(): starting 27844 1726882761.69088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710/ /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710/AnsiballZ_command.py && sleep 0' 27844 1726882761.70076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.70095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.70110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.70128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.70173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.70191: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.70211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.70228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.70238: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.70247: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.70257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.70276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.70295: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.70317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.70330: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.70346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.70435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.70456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.70483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.70609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.72345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.72403: stderr chunk (state=3): >>><<< 27844 1726882761.72405: stdout chunk (state=3): >>><<< 27844 1726882761.72457: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882761.72476: _low_level_execute_command(): starting 27844 1726882761.72480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710/AnsiballZ_command.py && sleep 0' 27844 1726882761.73319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882761.73326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.73328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.73330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.73332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.73334: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882761.73336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.73338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882761.73339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882761.73341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882761.73343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.73345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882761.73347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.73350: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882761.73352: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882761.73353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.73355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.73357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882761.73359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.73428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.86745: stdout chunk (state=3): >>> {"changed": true, "stdout": "8e:7f:0b:ff:ac:0e", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-20 21:39:21.863104", "end": "2024-09-20 21:39:21.866002", "delta": "0:00:00.002898", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882761.87884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882761.87928: stderr chunk (state=3): >>><<< 27844 1726882761.87931: stdout chunk (state=3): >>><<< 27844 1726882761.87944: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "8e:7f:0b:ff:ac:0e", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-20 21:39:21.863104", "end": "2024-09-20 21:39:21.866002", "delta": "0:00:00.002898", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
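[Editor's note] The stdout chunk above is the JSON document that the AnsiballZ-wrapped command module prints; the worker parses it to build the task result shown next. A minimal sketch of that parsing step, using the fields captured in the log (the MAC-format regex at the end is my own illustrative addition, not something Ansible performs):

```python
import json
import re

# Abridged verbatim module output from the log above (only the fields used here)
raw = (
    '{"changed": true, "stdout": "8e:7f:0b:ff:ac:0e", "rc": 0, '
    '"cmd": ["cat", "/sys/class/net/ethtest1/address"]}'
)

# The worker deserializes the module's stdout into the task result dict
result = json.loads(raw)

# The playbook only consumes rc and stdout; the MAC check is illustrative
assert result["rc"] == 0
mac = result["stdout"]
assert re.fullmatch(r"([0-9a-f]{2}:){5}[0-9a-f]{2}", mac) is not None
```

This is why a command task reports `changed: true` internally yet the `command` action can still surface `"changed": false` in the final callback output, as the task result a few entries below shows: the raw module dict is post-processed before display.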
27844 1726882761.87977: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/ethtest1/address', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882761.87984: _low_level_execute_command(): starting 27844 1726882761.87989: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882761.5881827-28847-186543019457710/ > /dev/null 2>&1 && sleep 0' 27844 1726882761.88393: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882761.88399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882761.88451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.88454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882761.88457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
27844 1726882761.88459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882761.88461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882761.88511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882761.88515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882761.88612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882761.90387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882761.90426: stderr chunk (state=3): >>><<< 27844 1726882761.90430: stdout chunk (state=3): >>><<< 27844 1726882761.90441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882761.90448: handler run complete 27844 1726882761.90470: Evaluated conditional (False): False 27844 1726882761.90481: attempt loop complete, returning result 27844 1726882761.90484: _execute() done 27844 1726882761.90486: dumping result to json 27844 1726882761.90490: done dumping result, returning 27844 1726882761.90497: done running TaskExecutor() for managed_node1/TASK: Get the interface1 MAC address [0e448fcc-3ce9-efa9-466a-000000000064] 27844 1726882761.90501: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000064 27844 1726882761.90598: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000064 27844 1726882761.90601: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "cat", "/sys/class/net/ethtest1/address" ], "delta": "0:00:00.002898", "end": "2024-09-20 21:39:21.866002", "rc": 0, "start": "2024-09-20 21:39:21.863104" } STDOUT: 8e:7f:0b:ff:ac:0e 27844 1726882761.90668: no more pending results, returning what we have 27844 1726882761.90672: results queue empty 27844 1726882761.90673: checking for any_errors_fatal 27844 1726882761.90680: done checking for any_errors_fatal 27844 1726882761.90681: checking for max_fail_percentage 27844 1726882761.90683: done checking for max_fail_percentage 27844 1726882761.90684: checking to see if all hosts have failed and the running result is not ok 27844 1726882761.90685: done checking to see if all hosts have failed 27844 1726882761.90685: getting the remaining hosts for this loop 27844 1726882761.90687: done getting the remaining hosts for this loop 27844 1726882761.90690: getting the next task for host managed_node1 27844 1726882761.90697: done getting next task for host managed_node1 27844 1726882761.90703: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27844 1726882761.90706: ^ 
state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882761.90723: getting variables 27844 1726882761.90724: in VariableManager get_vars() 27844 1726882761.90761: Calling all_inventory to load vars for managed_node1 27844 1726882761.90766: Calling groups_inventory to load vars for managed_node1 27844 1726882761.90774: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882761.90784: Calling all_plugins_play to load vars for managed_node1 27844 1726882761.90786: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882761.90789: Calling groups_plugins_play to load vars for managed_node1 27844 1726882761.91762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882761.92810: done with get_vars() 27844 1726882761.92832: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:21 -0400 (0:00:00.394) 0:00:21.006 ****** 27844 1726882761.92942: entering _queue_task() for managed_node1/include_tasks 27844 1726882761.93227: worker is 1 (out of 1 available) 27844 1726882761.93239: exiting _queue_task() for managed_node1/include_tasks 27844 1726882761.93250: done queuing things up, now waiting for results queue to drain 27844 1726882761.93251: waiting for pending 
results... 27844 1726882761.93667: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27844 1726882761.93770: in run() - task 0e448fcc-3ce9-efa9-466a-00000000006c 27844 1726882761.93780: variable 'ansible_search_path' from source: unknown 27844 1726882761.93784: variable 'ansible_search_path' from source: unknown 27844 1726882761.93815: calling self._execute() 27844 1726882761.93893: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882761.93897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882761.93904: variable 'omit' from source: magic vars 27844 1726882761.94186: variable 'ansible_distribution_major_version' from source: facts 27844 1726882761.94196: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882761.94202: _execute() done 27844 1726882761.94205: dumping result to json 27844 1726882761.94208: done dumping result, returning 27844 1726882761.94215: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-efa9-466a-00000000006c] 27844 1726882761.94220: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006c 27844 1726882761.94310: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006c 27844 1726882761.94314: WORKER PROCESS EXITING 27844 1726882761.94353: no more pending results, returning what we have 27844 1726882761.94358: in VariableManager get_vars() 27844 1726882761.94404: Calling all_inventory to load vars for managed_node1 27844 1726882761.94407: Calling groups_inventory to load vars for managed_node1 27844 1726882761.94409: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882761.94418: Calling all_plugins_play to load vars for managed_node1 27844 1726882761.94420: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882761.94422: Calling 
groups_plugins_play to load vars for managed_node1 27844 1726882761.95191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882761.96677: done with get_vars() 27844 1726882761.96698: variable 'ansible_search_path' from source: unknown 27844 1726882761.96700: variable 'ansible_search_path' from source: unknown 27844 1726882761.96737: we have included files to process 27844 1726882761.96739: generating all_blocks data 27844 1726882761.96741: done generating all_blocks data 27844 1726882761.96746: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27844 1726882761.96748: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27844 1726882761.96755: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27844 1726882761.97339: done processing included file 27844 1726882761.97342: iterating over new_blocks loaded from include file 27844 1726882761.97343: in VariableManager get_vars() 27844 1726882761.97368: done with get_vars() 27844 1726882761.97370: filtering new block on tags 27844 1726882761.97387: done filtering new block on tags 27844 1726882761.97390: in VariableManager get_vars() 27844 1726882761.97417: done with get_vars() 27844 1726882761.97420: filtering new block on tags 27844 1726882761.97441: done filtering new block on tags 27844 1726882761.97443: in VariableManager get_vars() 27844 1726882761.97468: done with get_vars() 27844 1726882761.97470: filtering new block on tags 27844 1726882761.97489: done filtering new block on tags 27844 1726882761.97491: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 27844 1726882761.97495: extending task lists for 
all hosts with included blocks 27844 1726882761.98350: done extending task lists 27844 1726882761.98352: done processing included files 27844 1726882761.98353: results queue empty 27844 1726882761.98353: checking for any_errors_fatal 27844 1726882761.98358: done checking for any_errors_fatal 27844 1726882761.98359: checking for max_fail_percentage 27844 1726882761.98360: done checking for max_fail_percentage 27844 1726882761.98361: checking to see if all hosts have failed and the running result is not ok 27844 1726882761.98362: done checking to see if all hosts have failed 27844 1726882761.98362: getting the remaining hosts for this loop 27844 1726882761.98365: done getting the remaining hosts for this loop 27844 1726882761.98368: getting the next task for host managed_node1 27844 1726882761.98371: done getting next task for host managed_node1 27844 1726882761.98377: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27844 1726882761.98381: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882761.98390: getting variables 27844 1726882761.98391: in VariableManager get_vars() 27844 1726882761.98406: Calling all_inventory to load vars for managed_node1 27844 1726882761.98408: Calling groups_inventory to load vars for managed_node1 27844 1726882761.98410: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882761.98415: Calling all_plugins_play to load vars for managed_node1 27844 1726882761.98417: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882761.98420: Calling groups_plugins_play to load vars for managed_node1 27844 1726882761.99739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882762.01488: done with get_vars() 27844 1726882762.01508: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:22 -0400 (0:00:00.086) 0:00:21.092 ****** 27844 1726882762.01586: entering _queue_task() for managed_node1/setup 27844 1726882762.01893: worker is 1 (out of 1 available) 27844 1726882762.01904: exiting _queue_task() for managed_node1/setup 27844 1726882762.01915: done queuing things up, now waiting for results queue to drain 27844 1726882762.01917: waiting for pending results... 
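The task queued here, `fedora.linux_system_roles.network : Ensure ansible_facts used by role are present` (task path `set_facts.yml:3`), runs the `setup` action guarded by the conditional the log evaluates shortly after. A hedged sketch of what that task plausibly looks like, inferred only from the action type (`_queue_task() for managed_node1/setup`) and the logged conditional; the `gather_subset` value is an assumption:

```yaml
# Hedged sketch of set_facts.yml:3, inferred from this log's action type and
# conditional; gather_subset is an assumed value, not confirmed by the log.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```

In this run the conditional evaluates to `False` (all required facts are already present), so the task is skipped.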
27844 1726882762.02212: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27844 1726882762.02352: in run() - task 0e448fcc-3ce9-efa9-466a-000000000563 27844 1726882762.02372: variable 'ansible_search_path' from source: unknown 27844 1726882762.02376: variable 'ansible_search_path' from source: unknown 27844 1726882762.02416: calling self._execute() 27844 1726882762.02514: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882762.02517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882762.02529: variable 'omit' from source: magic vars 27844 1726882762.02897: variable 'ansible_distribution_major_version' from source: facts 27844 1726882762.02915: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882762.03123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882762.05521: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882762.05601: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882762.05645: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882762.05682: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882762.05708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882762.05792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882762.05818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882762.05845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882762.05890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882762.05902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882762.05952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882762.05980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882762.06002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882762.06038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882762.06055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882762.06283: variable '__network_required_facts' from source: role 
'' defaults 27844 1726882762.06295: variable 'ansible_facts' from source: unknown 27844 1726882762.07230: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27844 1726882762.07236: when evaluation is False, skipping this task 27844 1726882762.07240: _execute() done 27844 1726882762.07243: dumping result to json 27844 1726882762.07253: done dumping result, returning 27844 1726882762.07261: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-efa9-466a-000000000563] 27844 1726882762.07271: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000563 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882762.07413: no more pending results, returning what we have 27844 1726882762.07418: results queue empty 27844 1726882762.07419: checking for any_errors_fatal 27844 1726882762.07421: done checking for any_errors_fatal 27844 1726882762.07422: checking for max_fail_percentage 27844 1726882762.07423: done checking for max_fail_percentage 27844 1726882762.07424: checking to see if all hosts have failed and the running result is not ok 27844 1726882762.07425: done checking to see if all hosts have failed 27844 1726882762.07426: getting the remaining hosts for this loop 27844 1726882762.07428: done getting the remaining hosts for this loop 27844 1726882762.07432: getting the next task for host managed_node1 27844 1726882762.07441: done getting next task for host managed_node1 27844 1726882762.07445: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27844 1726882762.07450: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882762.07475: getting variables 27844 1726882762.07478: in VariableManager get_vars() 27844 1726882762.07524: Calling all_inventory to load vars for managed_node1 27844 1726882762.07527: Calling groups_inventory to load vars for managed_node1 27844 1726882762.07530: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882762.07543: Calling all_plugins_play to load vars for managed_node1 27844 1726882762.07546: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882762.07550: Calling groups_plugins_play to load vars for managed_node1 27844 1726882762.08071: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000563 27844 1726882762.08075: WORKER PROCESS EXITING 27844 1726882762.10356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882762.11307: done with get_vars() 27844 1726882762.11323: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:22 -0400 (0:00:00.098) 0:00:21.190 ****** 27844 1726882762.11400: entering _queue_task() for managed_node1/stat 27844 1726882762.11628: worker is 
1 (out of 1 available) 27844 1726882762.11642: exiting _queue_task() for managed_node1/stat 27844 1726882762.11653: done queuing things up, now waiting for results queue to drain 27844 1726882762.11655: waiting for pending results... 27844 1726882762.11890: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 27844 1726882762.12125: in run() - task 0e448fcc-3ce9-efa9-466a-000000000565 27844 1726882762.12129: variable 'ansible_search_path' from source: unknown 27844 1726882762.12132: variable 'ansible_search_path' from source: unknown 27844 1726882762.12135: calling self._execute() 27844 1726882762.13176: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882762.13221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882762.13243: variable 'omit' from source: magic vars 27844 1726882762.13631: variable 'ansible_distribution_major_version' from source: facts 27844 1726882762.13648: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882762.13827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882762.14114: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882762.14159: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882762.14204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882762.14247: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882762.14346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882762.14380: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882762.14410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882762.14449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882762.14551: variable '__network_is_ostree' from source: set_fact 27844 1726882762.14561: Evaluated conditional (not __network_is_ostree is defined): False 27844 1726882762.14573: when evaluation is False, skipping this task 27844 1726882762.14579: _execute() done 27844 1726882762.14586: dumping result to json 27844 1726882762.14593: done dumping result, returning 27844 1726882762.14602: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-efa9-466a-000000000565] 27844 1726882762.14611: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000565 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27844 1726882762.14755: no more pending results, returning what we have 27844 1726882762.14760: results queue empty 27844 1726882762.14761: checking for any_errors_fatal 27844 1726882762.14774: done checking for any_errors_fatal 27844 1726882762.14775: checking for max_fail_percentage 27844 1726882762.14777: done checking for max_fail_percentage 27844 1726882762.14778: checking to see if all hosts have failed and the running result is not ok 27844 1726882762.14779: done checking to see if all hosts have failed 27844 1726882762.14780: getting the remaining hosts for this loop 27844 
1726882762.14782: done getting the remaining hosts for this loop 27844 1726882762.14785: getting the next task for host managed_node1 27844 1726882762.14792: done getting next task for host managed_node1 27844 1726882762.14796: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27844 1726882762.14800: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882762.14818: getting variables 27844 1726882762.14820: in VariableManager get_vars() 27844 1726882762.14857: Calling all_inventory to load vars for managed_node1 27844 1726882762.14860: Calling groups_inventory to load vars for managed_node1 27844 1726882762.14863: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882762.14879: Calling all_plugins_play to load vars for managed_node1 27844 1726882762.14882: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882762.14885: Calling groups_plugins_play to load vars for managed_node1 27844 1726882762.16304: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000565 27844 1726882762.16307: WORKER PROCESS EXITING 27844 1726882762.17126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882762.19967: done with get_vars() 27844 1726882762.19990: done getting variables 27844 1726882762.20052: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:22 -0400 (0:00:00.086) 0:00:21.277 ****** 27844 1726882762.20095: entering _queue_task() for managed_node1/set_fact 27844 1726882762.20383: worker is 1 (out of 1 available) 27844 1726882762.20394: exiting _queue_task() for managed_node1/set_fact 27844 1726882762.20405: done queuing things up, now waiting for results queue to drain 27844 1726882762.20407: waiting for pending results... 
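The next two records cover the role's ostree-detection pair from `set_facts.yml` (`Check if system is ostree` at line 12, `Set flag to indicate system is ostree` at line 17). Both are skipped here because `__network_is_ostree` is already set as a fact. A hedged sketch of that pair, reconstructed from the task names, action types (`stat`, `set_fact`), and the logged `false_condition`; the stat path and register name are assumptions:

```yaml
# Hedged sketch of the ostree-detection tasks; the /run/ostree-booted path
# and register name are assumptions inferred from common practice.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Because a prior play already set `__network_is_ostree` via `set_fact` (the log notes `variable '__network_is_ostree' from source: set_fact`), the `when` guard is `False` for both tasks.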
27844 1726882762.20696: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27844 1726882762.20893: in run() - task 0e448fcc-3ce9-efa9-466a-000000000566 27844 1726882762.20912: variable 'ansible_search_path' from source: unknown 27844 1726882762.20919: variable 'ansible_search_path' from source: unknown 27844 1726882762.20988: calling self._execute() 27844 1726882762.21116: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882762.21143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882762.21168: variable 'omit' from source: magic vars 27844 1726882762.22402: variable 'ansible_distribution_major_version' from source: facts 27844 1726882762.22420: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882762.22720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882762.23019: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882762.23071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882762.23119: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882762.23157: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882762.23286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882762.23316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882762.23354: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882762.23392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882762.23496: variable '__network_is_ostree' from source: set_fact 27844 1726882762.23508: Evaluated conditional (not __network_is_ostree is defined): False 27844 1726882762.23515: when evaluation is False, skipping this task 27844 1726882762.23521: _execute() done 27844 1726882762.23528: dumping result to json 27844 1726882762.23537: done dumping result, returning 27844 1726882762.23556: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-efa9-466a-000000000566] 27844 1726882762.23570: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000566 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27844 1726882762.23717: no more pending results, returning what we have 27844 1726882762.23722: results queue empty 27844 1726882762.23724: checking for any_errors_fatal 27844 1726882762.23732: done checking for any_errors_fatal 27844 1726882762.23733: checking for max_fail_percentage 27844 1726882762.23734: done checking for max_fail_percentage 27844 1726882762.23735: checking to see if all hosts have failed and the running result is not ok 27844 1726882762.23736: done checking to see if all hosts have failed 27844 1726882762.23737: getting the remaining hosts for this loop 27844 1726882762.23739: done getting the remaining hosts for this loop 27844 1726882762.23742: getting the next task for host managed_node1 27844 1726882762.23752: done getting next task for host managed_node1 27844 
1726882762.23755: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27844 1726882762.23759: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882762.23785: getting variables 27844 1726882762.23788: in VariableManager get_vars() 27844 1726882762.23828: Calling all_inventory to load vars for managed_node1 27844 1726882762.23832: Calling groups_inventory to load vars for managed_node1 27844 1726882762.23834: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882762.23846: Calling all_plugins_play to load vars for managed_node1 27844 1726882762.23849: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882762.23852: Calling groups_plugins_play to load vars for managed_node1 27844 1726882762.24813: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000566 27844 1726882762.24817: WORKER PROCESS EXITING 27844 1726882762.25938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882762.27895: done with get_vars() 27844 1726882762.27917: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:22 -0400 (0:00:00.079) 0:00:21.357 ****** 27844 1726882762.28106: entering _queue_task() for managed_node1/service_facts 27844 1726882762.28381: worker is 1 (out of 1 available) 27844 1726882762.28393: exiting _queue_task() for managed_node1/service_facts 27844 1726882762.28406: done queuing things up, now waiting for results queue to drain 27844 1726882762.28407: waiting for pending results... 27844 1726882762.28691: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 27844 1726882762.28854: in run() - task 0e448fcc-3ce9-efa9-466a-000000000568 27844 1726882762.28882: variable 'ansible_search_path' from source: unknown 27844 1726882762.28889: variable 'ansible_search_path' from source: unknown 27844 1726882762.28925: calling self._execute() 27844 1726882762.29045: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882762.29056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882762.29110: variable 'omit' from source: magic vars 27844 1726882762.29510: variable 'ansible_distribution_major_version' from source: facts 27844 1726882762.29532: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882762.29549: variable 'omit' from source: magic vars 27844 1726882762.29630: variable 'omit' from source: magic vars 27844 1726882762.29675: variable 'omit' from source: magic vars 27844 1726882762.29720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882762.29760: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882762.29788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 
1726882762.29808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882762.29827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882762.29858: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882762.29879: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882762.29887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882762.30000: Set connection var ansible_shell_type to sh 27844 1726882762.30008: Set connection var ansible_connection to ssh 27844 1726882762.30018: Set connection var ansible_pipelining to False 27844 1726882762.30028: Set connection var ansible_timeout to 10 27844 1726882762.30043: Set connection var ansible_shell_executable to /bin/sh 27844 1726882762.30053: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882762.30092: variable 'ansible_shell_executable' from source: unknown 27844 1726882762.30099: variable 'ansible_connection' from source: unknown 27844 1726882762.30105: variable 'ansible_module_compression' from source: unknown 27844 1726882762.30109: variable 'ansible_shell_type' from source: unknown 27844 1726882762.30114: variable 'ansible_shell_executable' from source: unknown 27844 1726882762.30119: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882762.30124: variable 'ansible_pipelining' from source: unknown 27844 1726882762.30129: variable 'ansible_timeout' from source: unknown 27844 1726882762.30134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882762.30342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882762.30359: variable 'omit' from source: magic vars 27844 1726882762.30379: starting attempt loop 27844 1726882762.30386: running the handler 27844 1726882762.30403: _low_level_execute_command(): starting 27844 1726882762.30419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882762.31211: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882762.31227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.31246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.31271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.31318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882762.31330: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882762.31344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.31371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882762.31386: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882762.31402: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882762.31417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.31432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.31449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.31470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 27844 1726882762.31485: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882762.31500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.31583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882762.31600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882762.31615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882762.31753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882762.33414: stdout chunk (state=3): >>>/root <<< 27844 1726882762.33593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882762.33596: stdout chunk (state=3): >>><<< 27844 1726882762.33599: stderr chunk (state=3): >>><<< 27844 1726882762.33705: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882762.33711: _low_level_execute_command(): starting 27844 1726882762.33714: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973 `" && echo ansible-tmp-1726882762.3361902-28887-97313835522973="` echo /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973 `" ) && sleep 0' 27844 1726882762.34302: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882762.34315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.34328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.34344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.34395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882762.34406: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882762.34418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.34434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882762.34445: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882762.34454: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882762.34470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.34491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.34507: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.34518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882762.34528: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882762.34539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.34625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882762.34646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882762.34660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882762.34794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882762.36673: stdout chunk (state=3): >>>ansible-tmp-1726882762.3361902-28887-97313835522973=/root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973 <<< 27844 1726882762.36850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882762.36853: stdout chunk (state=3): >>><<< 27844 1726882762.36855: stderr chunk (state=3): >>><<< 27844 1726882762.36975: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882762.3361902-28887-97313835522973=/root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882762.36978: variable 'ansible_module_compression' from source: unknown 27844 1726882762.36981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 27844 1726882762.37083: variable 'ansible_facts' from source: unknown 27844 1726882762.37113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973/AnsiballZ_service_facts.py 27844 1726882762.37270: Sending initial data 27844 1726882762.37273: Sent initial data (161 bytes) 27844 1726882762.38286: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882762.38301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.38314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.38332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.38379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882762.38396: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882762.38410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.38426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882762.38436: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882762.38446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882762.38456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.38473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.38489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.38505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882762.38516: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882762.38528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.38611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882762.38636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882762.38651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882762.38779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882762.40511: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882762.40611: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: 
Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882762.40706: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp5nno4qnh /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973/AnsiballZ_service_facts.py <<< 27844 1726882762.40793: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882762.41903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882762.41995: stderr chunk (state=3): >>><<< 27844 1726882762.41998: stdout chunk (state=3): >>><<< 27844 1726882762.42014: done transferring module to remote 27844 1726882762.42025: _low_level_execute_command(): starting 27844 1726882762.42030: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973/ /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973/AnsiballZ_service_facts.py && sleep 0' 27844 1726882762.42468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.42472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.42514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.42518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.42520: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.42572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882762.42593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882762.42599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882762.42694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882762.44418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882762.44460: stderr chunk (state=3): >>><<< 27844 1726882762.44463: stdout chunk (state=3): >>><<< 27844 1726882762.44476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 27844 1726882762.44479: _low_level_execute_command(): starting 27844 1726882762.44483: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973/AnsiballZ_service_facts.py && sleep 0' 27844 1726882762.44933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882762.44936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882762.44939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882762.44940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882762.44972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.44986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882762.45034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882762.45038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882762.45141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882763.76208: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": 
{"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", 
<<< 27844 1726882763.76216: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 27844 1726882763.76241: stdout chunk (state=3): >>>e": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 27844 1726882763.76258: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 27844 1726882763.76278: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "system<<< 27844 1726882763.76281: stdout chunk (state=3): >>>d"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": 
{"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27844 1726882763.77640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882763.77643: stdout chunk (state=3): >>><<< 27844 1726882763.77646: stderr chunk (state=3): >>><<< 27844 1726882763.77974: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": 
"rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": 
"systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": 
"teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882763.78402: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882763.78416: _low_level_execute_command(): starting 27844 1726882763.78431: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882762.3361902-28887-97313835522973/ > /dev/null 2>&1 && sleep 0' 27844 1726882763.79097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882763.79111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882763.79124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882763.79140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882763.79187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882763.79203: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882763.79218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882763.79234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882763.79244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 27844 1726882763.79254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882763.79267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882763.79280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882763.79294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882763.79310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882763.79326: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882763.79343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882763.79429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882763.79453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882763.79473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882763.79604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882763.81449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882763.81453: stdout chunk (state=3): >>><<< 27844 1726882763.81455: stderr chunk (state=3): >>><<< 27844 1726882763.81729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882763.81733: handler run complete 27844 1726882763.81735: variable 'ansible_facts' from source: unknown 27844 1726882763.81787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882763.82054: variable 'ansible_facts' from source: unknown 27844 1726882763.82128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882763.82233: attempt loop complete, returning result 27844 1726882763.82236: _execute() done 27844 1726882763.82238: dumping result to json 27844 1726882763.82276: done dumping result, returning 27844 1726882763.82284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-efa9-466a-000000000568] 27844 1726882763.82288: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000568 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882763.82978: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000568 27844 1726882763.82981: WORKER PROCESS EXITING 27844 1726882763.82991: no more pending results, returning what we have 27844 1726882763.82993: results queue empty 27844 1726882763.82994: checking for 
any_errors_fatal 27844 1726882763.82997: done checking for any_errors_fatal 27844 1726882763.82998: checking for max_fail_percentage 27844 1726882763.82999: done checking for max_fail_percentage 27844 1726882763.82999: checking to see if all hosts have failed and the running result is not ok 27844 1726882763.83000: done checking to see if all hosts have failed 27844 1726882763.83000: getting the remaining hosts for this loop 27844 1726882763.83001: done getting the remaining hosts for this loop 27844 1726882763.83004: getting the next task for host managed_node1 27844 1726882763.83008: done getting next task for host managed_node1 27844 1726882763.83010: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27844 1726882763.83013: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882763.83019: getting variables 27844 1726882763.83020: in VariableManager get_vars() 27844 1726882763.83044: Calling all_inventory to load vars for managed_node1 27844 1726882763.83046: Calling groups_inventory to load vars for managed_node1 27844 1726882763.83047: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882763.83053: Calling all_plugins_play to load vars for managed_node1 27844 1726882763.83055: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882763.83060: Calling groups_plugins_play to load vars for managed_node1 27844 1726882763.84133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882763.85389: done with get_vars() 27844 1726882763.85406: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:23 -0400 (0:00:01.573) 0:00:22.931 ****** 27844 1726882763.85480: entering _queue_task() for managed_node1/package_facts 27844 1726882763.85682: worker is 1 (out of 1 available) 27844 1726882763.85693: exiting _queue_task() for managed_node1/package_facts 27844 1726882763.85706: done queuing things up, now waiting for results queue to drain 27844 1726882763.85708: waiting for pending results... 
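The task banner above ends with a timing suffix, `(0:00:01.573) 0:00:22.931`, which appears to carry the previous task's duration followed by the cumulative elapsed time. A small, hypothetical helper (an assumption for log analysis, not an ansible-core API) could extract both values from such a line:

```python
import re

# Matches the "(H:MM:SS.mmm)  H:MM:SS.mmm" pair that the task-timing
# output appends after the timestamp in banners like the one above.
TIMING = re.compile(r"\((\d+:\d{2}:\d{2}\.\d+)\)\s+(\d+:\d{2}:\d{2}\.\d+)")

def to_seconds(hms: str) -> float:
    """Convert 'H:MM:SS.mmm' into seconds."""
    h, m, s = hms.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def parse_timing(line: str):
    """Return (previous_task_seconds, cumulative_seconds), or None if
    the line carries no timing suffix."""
    m = TIMING.search(line)
    if not m:
        return None
    return to_seconds(m.group(1)), to_seconds(m.group(2))

line = ("Friday 20 September 2024  21:39:23 -0400 "
        "(0:00:01.573)       0:00:22.931 ******")
print(parse_timing(line))
```

Here the ~22.9 s cumulative figure is consistent with the epoch timestamps in the log (run start ~1726882740, current line ~1726882763).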
27844 1726882763.85884: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 27844 1726882763.85977: in run() - task 0e448fcc-3ce9-efa9-466a-000000000569 27844 1726882763.85988: variable 'ansible_search_path' from source: unknown 27844 1726882763.85991: variable 'ansible_search_path' from source: unknown 27844 1726882763.86018: calling self._execute() 27844 1726882763.86092: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882763.86096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882763.86105: variable 'omit' from source: magic vars 27844 1726882763.86364: variable 'ansible_distribution_major_version' from source: facts 27844 1726882763.86376: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882763.86379: variable 'omit' from source: magic vars 27844 1726882763.86421: variable 'omit' from source: magic vars 27844 1726882763.86443: variable 'omit' from source: magic vars 27844 1726882763.86478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882763.86503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882763.86518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882763.86530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882763.86540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882763.86561: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882763.86568: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882763.86571: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 27844 1726882763.86637: Set connection var ansible_shell_type to sh 27844 1726882763.86641: Set connection var ansible_connection to ssh 27844 1726882763.86644: Set connection var ansible_pipelining to False 27844 1726882763.86650: Set connection var ansible_timeout to 10 27844 1726882763.86655: Set connection var ansible_shell_executable to /bin/sh 27844 1726882763.86660: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882763.86684: variable 'ansible_shell_executable' from source: unknown 27844 1726882763.86687: variable 'ansible_connection' from source: unknown 27844 1726882763.86690: variable 'ansible_module_compression' from source: unknown 27844 1726882763.86693: variable 'ansible_shell_type' from source: unknown 27844 1726882763.86697: variable 'ansible_shell_executable' from source: unknown 27844 1726882763.86699: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882763.86701: variable 'ansible_pipelining' from source: unknown 27844 1726882763.86703: variable 'ansible_timeout' from source: unknown 27844 1726882763.86705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882763.86852: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882763.86860: variable 'omit' from source: magic vars 27844 1726882763.86867: starting attempt loop 27844 1726882763.86872: running the handler 27844 1726882763.86884: _low_level_execute_command(): starting 27844 1726882763.86891: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882763.87702: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882763.87827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882763.89412: stdout chunk (state=3): >>>/root <<< 27844 1726882763.89512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882763.89582: stderr chunk (state=3): >>><<< 27844 1726882763.89585: stdout chunk (state=3): >>><<< 27844 1726882763.89687: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882763.89690: _low_level_execute_command(): starting 27844 1726882763.89693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942 `" && echo ansible-tmp-1726882763.8960426-28962-88403679660942="` echo /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942 `" ) && sleep 0' 27844 1726882763.90268: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882763.90285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882763.90300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882763.90319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882763.90369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882763.90382: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882763.90397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882763.90415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882763.90434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882763.90445: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 27844 1726882763.90459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882763.90483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882763.90500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882763.90514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882763.90526: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882763.90541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882763.90626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882763.90647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882763.90667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882763.90798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882763.92633: stdout chunk (state=3): >>>ansible-tmp-1726882763.8960426-28962-88403679660942=/root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942 <<< 27844 1726882763.92751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882763.92831: stderr chunk (state=3): >>><<< 27844 1726882763.92842: stdout chunk (state=3): >>><<< 27844 1726882763.93076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882763.8960426-28962-88403679660942=/root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882763.93079: variable 'ansible_module_compression' from source: unknown 27844 1726882763.93082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 27844 1726882763.93084: variable 'ansible_facts' from source: unknown 27844 1726882763.93234: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942/AnsiballZ_package_facts.py 27844 1726882763.93402: Sending initial data 27844 1726882763.93405: Sent initial data (161 bytes) 27844 1726882763.94355: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882763.94373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882763.94393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882763.94418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882763.94459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 
27844 1726882763.94475: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882763.94490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882763.94517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882763.94528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882763.94539: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882763.94548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882763.94559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882763.94575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882763.94585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882763.94593: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882763.94606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882763.94683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882763.94704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882763.94723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882763.94856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882763.96590: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882763.96677: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882763.96770: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpgo6efc8m /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942/AnsiballZ_package_facts.py <<< 27844 1726882763.96856: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882763.99357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882763.99476: stderr chunk (state=3): >>><<< 27844 1726882763.99480: stdout chunk (state=3): >>><<< 27844 1726882763.99495: done transferring module to remote 27844 1726882763.99506: _low_level_execute_command(): starting 27844 1726882763.99512: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942/ /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942/AnsiballZ_package_facts.py && sleep 0' 27844 1726882763.99946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882763.99953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882763.99962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882763.99975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882764.00003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882764.00010: stderr chunk (state=3): >>>debug2: 
match not found <<< 27844 1726882764.00018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882764.00027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882764.00035: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882764.00042: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882764.00045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882764.00057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882764.00067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882764.00076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882764.00121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882764.00142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882764.00145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882764.00243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882764.02017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882764.02075: stderr chunk (state=3): >>><<< 27844 1726882764.02078: stdout chunk (state=3): >>><<< 27844 1726882764.02092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882764.02095: _low_level_execute_command(): starting 27844 1726882764.02101: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942/AnsiballZ_package_facts.py && sleep 0' 27844 1726882764.02732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882764.02741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882764.02751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882764.02767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882764.02811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882764.02821: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882764.02831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882764.02844: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 27844 1726882764.02851: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882764.02858: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882764.02869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882764.02881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882764.02893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882764.02900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882764.02908: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882764.02925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882764.03002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882764.03020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882764.03038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882764.03169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882764.49197: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 27844 1726882764.49230: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": 
"1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 27844 1726882764.49247: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", 
"version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 27844 1726882764.49259: stdout chunk (state=3): >>>: "rpm"}], 
"libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 27844 1726882764.49268: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 27844 1726882764.49276: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": 
[{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 27844 1726882764.49287: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", 
"version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 27844 1726882764.49309: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 27844 1726882764.49326: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 27844 1726882764.49332: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": 
"146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 27844 1726882764.49360: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": 
"libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 27844 1726882764.49379: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", 
"release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 27844 1726882764.49389: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 27844 1726882764.49404: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": 
"kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 27844 1726882764.49424: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 27844 1726882764.49434: stdout chunk 
(state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27844 1726882764.50961: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882764.50967: stdout chunk (state=3): >>><<< 27844 1726882764.50969: stderr chunk (state=3): >>><<< 27844 1726882764.51078: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", 
"version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", 
"version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": 
"7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", 
"version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": 
"2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": 
"2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", 
"version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": 
"32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", 
"version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", 
"version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": 
"perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": 
"dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": 
"1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": 
"3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", 
"version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], 
"strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882764.53226: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882764.53243: _low_level_execute_command(): starting 27844 1726882764.53246: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882763.8960426-28962-88403679660942/ > /dev/null 2>&1 && sleep 0' 27844 1726882764.53672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882764.53680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882764.53709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882764.53722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882764.53772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882764.53785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882764.53889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882764.55738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882764.55768: stderr chunk (state=3): >>><<< 27844 1726882764.55774: stdout chunk (state=3): >>><<< 27844 1726882764.55787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882764.55792: handler run complete 27844 1726882764.56527: variable 'ansible_facts' from 
source: unknown 27844 1726882764.57011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.58388: variable 'ansible_facts' from source: unknown 27844 1726882764.58646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.59090: attempt loop complete, returning result 27844 1726882764.59099: _execute() done 27844 1726882764.59102: dumping result to json 27844 1726882764.59336: done dumping result, returning 27844 1726882764.59339: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-efa9-466a-000000000569] 27844 1726882764.59342: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000569 27844 1726882764.61057: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000569 27844 1726882764.61060: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882764.61146: no more pending results, returning what we have 27844 1726882764.61151: results queue empty 27844 1726882764.61152: checking for any_errors_fatal 27844 1726882764.61157: done checking for any_errors_fatal 27844 1726882764.61158: checking for max_fail_percentage 27844 1726882764.61159: done checking for max_fail_percentage 27844 1726882764.61159: checking to see if all hosts have failed and the running result is not ok 27844 1726882764.61160: done checking to see if all hosts have failed 27844 1726882764.61161: getting the remaining hosts for this loop 27844 1726882764.61161: done getting the remaining hosts for this loop 27844 1726882764.61168: getting the next task for host managed_node1 27844 1726882764.61173: done getting next task for host managed_node1 27844 1726882764.61175: ^ task is: TASK: fedora.linux_system_roles.network : Print 
network provider 27844 1726882764.61178: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882764.61185: getting variables 27844 1726882764.61186: in VariableManager get_vars() 27844 1726882764.61211: Calling all_inventory to load vars for managed_node1 27844 1726882764.61212: Calling groups_inventory to load vars for managed_node1 27844 1726882764.61214: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882764.61220: Calling all_plugins_play to load vars for managed_node1 27844 1726882764.61222: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882764.61223: Calling groups_plugins_play to load vars for managed_node1 27844 1726882764.61956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.62881: done with get_vars() 27844 1726882764.62898: done getting variables 27844 1726882764.62940: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:24 -0400 (0:00:00.774) 
0:00:23.706 ****** 27844 1726882764.62968: entering _queue_task() for managed_node1/debug 27844 1726882764.63173: worker is 1 (out of 1 available) 27844 1726882764.63187: exiting _queue_task() for managed_node1/debug 27844 1726882764.63199: done queuing things up, now waiting for results queue to drain 27844 1726882764.63201: waiting for pending results... 27844 1726882764.63375: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 27844 1726882764.63454: in run() - task 0e448fcc-3ce9-efa9-466a-00000000006d 27844 1726882764.63469: variable 'ansible_search_path' from source: unknown 27844 1726882764.63473: variable 'ansible_search_path' from source: unknown 27844 1726882764.63501: calling self._execute() 27844 1726882764.63576: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.63580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.63588: variable 'omit' from source: magic vars 27844 1726882764.63864: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.63879: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882764.63885: variable 'omit' from source: magic vars 27844 1726882764.63923: variable 'omit' from source: magic vars 27844 1726882764.63993: variable 'network_provider' from source: set_fact 27844 1726882764.64007: variable 'omit' from source: magic vars 27844 1726882764.64038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882764.64062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882764.64082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882764.64097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 
1726882764.64106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882764.64127: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882764.64130: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.64133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.64204: Set connection var ansible_shell_type to sh 27844 1726882764.64207: Set connection var ansible_connection to ssh 27844 1726882764.64211: Set connection var ansible_pipelining to False 27844 1726882764.64217: Set connection var ansible_timeout to 10 27844 1726882764.64222: Set connection var ansible_shell_executable to /bin/sh 27844 1726882764.64227: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882764.64246: variable 'ansible_shell_executable' from source: unknown 27844 1726882764.64249: variable 'ansible_connection' from source: unknown 27844 1726882764.64252: variable 'ansible_module_compression' from source: unknown 27844 1726882764.64254: variable 'ansible_shell_type' from source: unknown 27844 1726882764.64256: variable 'ansible_shell_executable' from source: unknown 27844 1726882764.64259: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.64261: variable 'ansible_pipelining' from source: unknown 27844 1726882764.64268: variable 'ansible_timeout' from source: unknown 27844 1726882764.64271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.64370: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882764.64384: variable 'omit' from source: 
magic vars 27844 1726882764.64387: starting attempt loop 27844 1726882764.64390: running the handler 27844 1726882764.64426: handler run complete 27844 1726882764.64438: attempt loop complete, returning result 27844 1726882764.64440: _execute() done 27844 1726882764.64443: dumping result to json 27844 1726882764.64445: done dumping result, returning 27844 1726882764.64452: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-efa9-466a-00000000006d] 27844 1726882764.64456: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006d 27844 1726882764.64540: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006d 27844 1726882764.64543: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 27844 1726882764.64602: no more pending results, returning what we have 27844 1726882764.64605: results queue empty 27844 1726882764.64606: checking for any_errors_fatal 27844 1726882764.64613: done checking for any_errors_fatal 27844 1726882764.64614: checking for max_fail_percentage 27844 1726882764.64615: done checking for max_fail_percentage 27844 1726882764.64616: checking to see if all hosts have failed and the running result is not ok 27844 1726882764.64621: done checking to see if all hosts have failed 27844 1726882764.64622: getting the remaining hosts for this loop 27844 1726882764.64623: done getting the remaining hosts for this loop 27844 1726882764.64626: getting the next task for host managed_node1 27844 1726882764.64635: done getting next task for host managed_node1 27844 1726882764.64638: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27844 1726882764.64641: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882764.64651: getting variables 27844 1726882764.64652: in VariableManager get_vars() 27844 1726882764.64692: Calling all_inventory to load vars for managed_node1 27844 1726882764.64694: Calling groups_inventory to load vars for managed_node1 27844 1726882764.64696: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882764.64702: Calling all_plugins_play to load vars for managed_node1 27844 1726882764.64704: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882764.64706: Calling groups_plugins_play to load vars for managed_node1 27844 1726882764.65452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.66459: done with get_vars() 27844 1726882764.66478: done getting variables 27844 1726882764.66516: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:24 -0400 (0:00:00.035) 0:00:23.741 ****** 27844 1726882764.66538: entering _queue_task() for managed_node1/fail 27844 1726882764.66718: worker is 1 (out of 1 
available) 27844 1726882764.66731: exiting _queue_task() for managed_node1/fail 27844 1726882764.66743: done queuing things up, now waiting for results queue to drain 27844 1726882764.66744: waiting for pending results... 27844 1726882764.66915: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27844 1726882764.66994: in run() - task 0e448fcc-3ce9-efa9-466a-00000000006e 27844 1726882764.67004: variable 'ansible_search_path' from source: unknown 27844 1726882764.67012: variable 'ansible_search_path' from source: unknown 27844 1726882764.67045: calling self._execute() 27844 1726882764.67123: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.67127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.67136: variable 'omit' from source: magic vars 27844 1726882764.67406: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.67416: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882764.67502: variable 'network_state' from source: role '' defaults 27844 1726882764.67509: Evaluated conditional (network_state != {}): False 27844 1726882764.67512: when evaluation is False, skipping this task 27844 1726882764.67517: _execute() done 27844 1726882764.67519: dumping result to json 27844 1726882764.67521: done dumping result, returning 27844 1726882764.67526: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-efa9-466a-00000000006e] 27844 1726882764.67533: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006e 27844 1726882764.67620: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006e 27844 
1726882764.67623: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882764.67682: no more pending results, returning what we have 27844 1726882764.67686: results queue empty 27844 1726882764.67687: checking for any_errors_fatal 27844 1726882764.67691: done checking for any_errors_fatal 27844 1726882764.67692: checking for max_fail_percentage 27844 1726882764.67693: done checking for max_fail_percentage 27844 1726882764.67694: checking to see if all hosts have failed and the running result is not ok 27844 1726882764.67695: done checking to see if all hosts have failed 27844 1726882764.67696: getting the remaining hosts for this loop 27844 1726882764.67697: done getting the remaining hosts for this loop 27844 1726882764.67700: getting the next task for host managed_node1 27844 1726882764.67705: done getting next task for host managed_node1 27844 1726882764.67708: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27844 1726882764.67711: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882764.67725: getting variables 27844 1726882764.67727: in VariableManager get_vars() 27844 1726882764.67755: Calling all_inventory to load vars for managed_node1 27844 1726882764.67756: Calling groups_inventory to load vars for managed_node1 27844 1726882764.67758: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882764.67770: Calling all_plugins_play to load vars for managed_node1 27844 1726882764.67772: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882764.67774: Calling groups_plugins_play to load vars for managed_node1 27844 1726882764.68525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.69976: done with get_vars() 27844 1726882764.69997: done getting variables 27844 1726882764.70051: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:24 -0400 (0:00:00.035) 0:00:23.777 ****** 27844 1726882764.70086: entering _queue_task() for managed_node1/fail 27844 1726882764.70314: worker is 1 (out of 1 available) 27844 1726882764.70328: exiting _queue_task() for managed_node1/fail 27844 1726882764.70341: done queuing things up, now waiting for results queue to drain 27844 1726882764.70343: waiting for pending results... 
27844 1726882764.70624: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27844 1726882764.70756: in run() - task 0e448fcc-3ce9-efa9-466a-00000000006f 27844 1726882764.70782: variable 'ansible_search_path' from source: unknown 27844 1726882764.70795: variable 'ansible_search_path' from source: unknown 27844 1726882764.70837: calling self._execute() 27844 1726882764.70943: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.70955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.70974: variable 'omit' from source: magic vars 27844 1726882764.71357: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.71381: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882764.71514: variable 'network_state' from source: role '' defaults 27844 1726882764.71529: Evaluated conditional (network_state != {}): False 27844 1726882764.71536: when evaluation is False, skipping this task 27844 1726882764.71543: _execute() done 27844 1726882764.71556: dumping result to json 27844 1726882764.71568: done dumping result, returning 27844 1726882764.71580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-efa9-466a-00000000006f] 27844 1726882764.71591: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882764.71735: no more pending results, returning what we have 27844 1726882764.71739: results queue empty 27844 1726882764.71740: checking for any_errors_fatal 27844 1726882764.71747: done checking for any_errors_fatal 
27844 1726882764.71748: checking for max_fail_percentage 27844 1726882764.71750: done checking for max_fail_percentage 27844 1726882764.71751: checking to see if all hosts have failed and the running result is not ok 27844 1726882764.71752: done checking to see if all hosts have failed 27844 1726882764.71753: getting the remaining hosts for this loop 27844 1726882764.71754: done getting the remaining hosts for this loop 27844 1726882764.71758: getting the next task for host managed_node1 27844 1726882764.71768: done getting next task for host managed_node1 27844 1726882764.71772: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27844 1726882764.71777: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882764.71798: getting variables 27844 1726882764.71800: in VariableManager get_vars() 27844 1726882764.71841: Calling all_inventory to load vars for managed_node1 27844 1726882764.71844: Calling groups_inventory to load vars for managed_node1 27844 1726882764.71847: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882764.71860: Calling all_plugins_play to load vars for managed_node1 27844 1726882764.71865: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882764.71870: Calling groups_plugins_play to load vars for managed_node1 27844 1726882764.72542: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000006f 27844 1726882764.72546: WORKER PROCESS EXITING 27844 1726882764.75831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.76735: done with get_vars() 27844 1726882764.76750: done getting variables 27844 1726882764.76786: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:24 -0400 (0:00:00.067) 0:00:23.844 ****** 27844 1726882764.76804: entering _queue_task() for managed_node1/fail 27844 1726882764.77019: worker is 1 (out of 1 available) 27844 1726882764.77034: exiting _queue_task() for managed_node1/fail 27844 1726882764.77044: done queuing things up, now waiting for results queue to drain 27844 1726882764.77047: waiting for pending results... 
27844 1726882764.77233: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27844 1726882764.77333: in run() - task 0e448fcc-3ce9-efa9-466a-000000000070 27844 1726882764.77345: variable 'ansible_search_path' from source: unknown 27844 1726882764.77350: variable 'ansible_search_path' from source: unknown 27844 1726882764.77383: calling self._execute() 27844 1726882764.77454: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.77458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.77470: variable 'omit' from source: magic vars 27844 1726882764.77739: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.77748: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882764.77870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882764.79420: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882764.79475: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882764.79501: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882764.79527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882764.79550: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882764.79607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.79626: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.79644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.79678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.79689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.79756: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.79770: Evaluated conditional (ansible_distribution_major_version | int > 9): False 27844 1726882764.79773: when evaluation is False, skipping this task 27844 1726882764.79776: _execute() done 27844 1726882764.79778: dumping result to json 27844 1726882764.79781: done dumping result, returning 27844 1726882764.79785: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-efa9-466a-000000000070] 27844 1726882764.79795: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000070 27844 1726882764.79881: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000070 27844 1726882764.79883: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 27844 1726882764.79943: no more pending results, returning what we have 27844 1726882764.79947: 
results queue empty 27844 1726882764.79948: checking for any_errors_fatal 27844 1726882764.79954: done checking for any_errors_fatal 27844 1726882764.79955: checking for max_fail_percentage 27844 1726882764.79957: done checking for max_fail_percentage 27844 1726882764.79958: checking to see if all hosts have failed and the running result is not ok 27844 1726882764.79959: done checking to see if all hosts have failed 27844 1726882764.79959: getting the remaining hosts for this loop 27844 1726882764.79961: done getting the remaining hosts for this loop 27844 1726882764.79969: getting the next task for host managed_node1 27844 1726882764.79974: done getting next task for host managed_node1 27844 1726882764.79982: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27844 1726882764.79985: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882764.80004: getting variables 27844 1726882764.80006: in VariableManager get_vars() 27844 1726882764.80039: Calling all_inventory to load vars for managed_node1 27844 1726882764.80042: Calling groups_inventory to load vars for managed_node1 27844 1726882764.80044: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882764.80052: Calling all_plugins_play to load vars for managed_node1 27844 1726882764.80054: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882764.80057: Calling groups_plugins_play to load vars for managed_node1 27844 1726882764.80828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.81776: done with get_vars() 27844 1726882764.81790: done getting variables 27844 1726882764.81831: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:24 -0400 (0:00:00.050) 0:00:23.895 ****** 27844 1726882764.81854: entering _queue_task() for managed_node1/dnf 27844 1726882764.82051: worker is 1 (out of 1 available) 27844 1726882764.82069: exiting _queue_task() for managed_node1/dnf 27844 1726882764.82082: done queuing things up, now waiting for results queue to drain 27844 1726882764.82083: waiting for pending results... 
27844 1726882764.82246: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27844 1726882764.82337: in run() - task 0e448fcc-3ce9-efa9-466a-000000000071 27844 1726882764.82347: variable 'ansible_search_path' from source: unknown 27844 1726882764.82350: variable 'ansible_search_path' from source: unknown 27844 1726882764.82385: calling self._execute() 27844 1726882764.82455: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.82460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.82469: variable 'omit' from source: magic vars 27844 1726882764.82747: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.82757: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882764.82894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882764.84673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882764.84721: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882764.84748: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882764.84778: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882764.84798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882764.84852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.84876: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.84895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.84922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.84932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.85012: variable 'ansible_distribution' from source: facts 27844 1726882764.85017: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.85027: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27844 1726882764.85105: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882764.85190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.85207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.85224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.85251: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.85261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.85294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.85312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.85328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.85353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.85366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.85393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.85409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 
1726882764.85430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.85456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.85468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.85577: variable 'network_connections' from source: task vars 27844 1726882764.85580: variable 'interface1' from source: play vars 27844 1726882764.85629: variable 'interface1' from source: play vars 27844 1726882764.85686: variable 'interface1_mac' from source: set_fact 27844 1726882764.85738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882764.85842: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882764.85880: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882764.85904: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882764.85925: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882764.85958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882764.85976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 27844 1726882764.85997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.86016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882764.86057: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882764.86212: variable 'network_connections' from source: task vars 27844 1726882764.86216: variable 'interface1' from source: play vars 27844 1726882764.86257: variable 'interface1' from source: play vars 27844 1726882764.86313: variable 'interface1_mac' from source: set_fact 27844 1726882764.86339: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882764.86343: when evaluation is False, skipping this task 27844 1726882764.86345: _execute() done 27844 1726882764.86348: dumping result to json 27844 1726882764.86350: done dumping result, returning 27844 1726882764.86356: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000071] 27844 1726882764.86360: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000071 27844 1726882764.86449: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000071 27844 1726882764.86452: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882764.86503: no more pending results, returning what we have 27844 1726882764.86507: results queue empty 
27844 1726882764.86507: checking for any_errors_fatal 27844 1726882764.86512: done checking for any_errors_fatal 27844 1726882764.86513: checking for max_fail_percentage 27844 1726882764.86514: done checking for max_fail_percentage 27844 1726882764.86515: checking to see if all hosts have failed and the running result is not ok 27844 1726882764.86516: done checking to see if all hosts have failed 27844 1726882764.86516: getting the remaining hosts for this loop 27844 1726882764.86518: done getting the remaining hosts for this loop 27844 1726882764.86521: getting the next task for host managed_node1 27844 1726882764.86525: done getting next task for host managed_node1 27844 1726882764.86529: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27844 1726882764.86531: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882764.86546: getting variables 27844 1726882764.86548: in VariableManager get_vars() 27844 1726882764.86590: Calling all_inventory to load vars for managed_node1 27844 1726882764.86593: Calling groups_inventory to load vars for managed_node1 27844 1726882764.86595: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882764.86603: Calling all_plugins_play to load vars for managed_node1 27844 1726882764.86606: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882764.86608: Calling groups_plugins_play to load vars for managed_node1 27844 1726882764.87461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.88375: done with get_vars() 27844 1726882764.88389: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27844 1726882764.88438: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:24 -0400 (0:00:00.066) 0:00:23.961 ****** 27844 1726882764.88458: entering _queue_task() for managed_node1/yum 27844 1726882764.88641: worker is 1 (out of 1 available) 27844 1726882764.88655: exiting _queue_task() for managed_node1/yum 27844 1726882764.88668: done queuing things up, now waiting for results queue to drain 27844 1726882764.88670: waiting for pending results... 
27844 1726882764.88846: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27844 1726882764.88941: in run() - task 0e448fcc-3ce9-efa9-466a-000000000072 27844 1726882764.88951: variable 'ansible_search_path' from source: unknown 27844 1726882764.88962: variable 'ansible_search_path' from source: unknown 27844 1726882764.88993: calling self._execute() 27844 1726882764.89063: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.89076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.89084: variable 'omit' from source: magic vars 27844 1726882764.89340: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.89350: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882764.89467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882764.91016: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882764.91067: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882764.91097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882764.91120: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882764.91145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882764.91204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.91224: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.91243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.91273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.91284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.91347: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.91388: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27844 1726882764.91470: when evaluation is False, skipping this task 27844 1726882764.91474: _execute() done 27844 1726882764.91478: dumping result to json 27844 1726882764.91480: done dumping result, returning 27844 1726882764.91483: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000072] 27844 1726882764.91485: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000072 27844 1726882764.91557: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000072 27844 1726882764.91560: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27844 1726882764.91639: no more pending results, returning 
what we have 27844 1726882764.91643: results queue empty 27844 1726882764.91643: checking for any_errors_fatal 27844 1726882764.91647: done checking for any_errors_fatal 27844 1726882764.91648: checking for max_fail_percentage 27844 1726882764.91650: done checking for max_fail_percentage 27844 1726882764.91651: checking to see if all hosts have failed and the running result is not ok 27844 1726882764.91652: done checking to see if all hosts have failed 27844 1726882764.91652: getting the remaining hosts for this loop 27844 1726882764.91654: done getting the remaining hosts for this loop 27844 1726882764.91656: getting the next task for host managed_node1 27844 1726882764.91662: done getting next task for host managed_node1 27844 1726882764.91669: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27844 1726882764.91672: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882764.91691: getting variables 27844 1726882764.91693: in VariableManager get_vars() 27844 1726882764.91726: Calling all_inventory to load vars for managed_node1 27844 1726882764.91729: Calling groups_inventory to load vars for managed_node1 27844 1726882764.91731: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882764.91739: Calling all_plugins_play to load vars for managed_node1 27844 1726882764.91741: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882764.91744: Calling groups_plugins_play to load vars for managed_node1 27844 1726882764.92511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882764.94242: done with get_vars() 27844 1726882764.94262: done getting variables 27844 1726882764.94321: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:24 -0400 (0:00:00.058) 0:00:24.020 ****** 27844 1726882764.94354: entering _queue_task() for managed_node1/fail 27844 1726882764.94620: worker is 1 (out of 1 available) 27844 1726882764.94632: exiting _queue_task() for managed_node1/fail 27844 1726882764.94643: done queuing things up, now waiting for results queue to drain 27844 1726882764.94645: waiting for pending results... 
27844 1726882764.94921: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27844 1726882764.95014: in run() - task 0e448fcc-3ce9-efa9-466a-000000000073 27844 1726882764.95023: variable 'ansible_search_path' from source: unknown 27844 1726882764.95026: variable 'ansible_search_path' from source: unknown 27844 1726882764.95062: calling self._execute() 27844 1726882764.95147: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882764.95151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882764.95159: variable 'omit' from source: magic vars 27844 1726882764.95431: variable 'ansible_distribution_major_version' from source: facts 27844 1726882764.95441: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882764.95525: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882764.95652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882764.97481: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882764.97552: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882764.97605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882764.97642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882764.97685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882764.97759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 27844 1726882764.97806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.97835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.97892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.97914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.97960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.97994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.98032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.98081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.98100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.98152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882764.98185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882764.98215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.98271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882764.98291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882764.98484: variable 'network_connections' from source: task vars 27844 1726882764.98500: variable 'interface1' from source: play vars 27844 1726882764.98581: variable 'interface1' from source: play vars 27844 1726882764.98669: variable 'interface1_mac' from source: set_fact 27844 1726882764.99117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882764.99291: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882764.99325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882764.99352: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882764.99382: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882764.99421: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882764.99444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882764.99468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882764.99500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882764.99562: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882764.99792: variable 'network_connections' from source: task vars 27844 1726882764.99802: variable 'interface1' from source: play vars 27844 1726882764.99865: variable 'interface1' from source: play vars 27844 1726882764.99939: variable 'interface1_mac' from source: set_fact 27844 1726882764.99980: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882764.99987: when evaluation is False, skipping this task 27844 1726882764.99992: _execute() done 27844 1726882765.00000: dumping result to json 27844 1726882765.00007: done dumping result, returning 27844 1726882765.00017: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 
[0e448fcc-3ce9-efa9-466a-000000000073] 27844 1726882765.00033: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000073 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882765.00183: no more pending results, returning what we have 27844 1726882765.00187: results queue empty 27844 1726882765.00188: checking for any_errors_fatal 27844 1726882765.00193: done checking for any_errors_fatal 27844 1726882765.00194: checking for max_fail_percentage 27844 1726882765.00195: done checking for max_fail_percentage 27844 1726882765.00196: checking to see if all hosts have failed and the running result is not ok 27844 1726882765.00197: done checking to see if all hosts have failed 27844 1726882765.00197: getting the remaining hosts for this loop 27844 1726882765.00199: done getting the remaining hosts for this loop 27844 1726882765.00202: getting the next task for host managed_node1 27844 1726882765.00207: done getting next task for host managed_node1 27844 1726882765.00211: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27844 1726882765.00214: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882765.00232: getting variables 27844 1726882765.00234: in VariableManager get_vars() 27844 1726882765.00276: Calling all_inventory to load vars for managed_node1 27844 1726882765.00279: Calling groups_inventory to load vars for managed_node1 27844 1726882765.00281: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882765.00291: Calling all_plugins_play to load vars for managed_node1 27844 1726882765.00294: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882765.00297: Calling groups_plugins_play to load vars for managed_node1 27844 1726882765.01080: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000073 27844 1726882765.01084: WORKER PROCESS EXITING 27844 1726882765.01698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882765.03484: done with get_vars() 27844 1726882765.03511: done getting variables 27844 1726882765.03574: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:25 -0400 (0:00:00.092) 0:00:24.112 ****** 27844 1726882765.03615: entering _queue_task() for managed_node1/package 27844 1726882765.03903: worker is 1 (out of 1 available) 27844 1726882765.03915: exiting _queue_task() for managed_node1/package 27844 1726882765.03932: done queuing things up, now waiting for results queue to drain 27844 1726882765.03934: waiting for pending results... 
27844 1726882765.04238: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 27844 1726882765.04401: in run() - task 0e448fcc-3ce9-efa9-466a-000000000074 27844 1726882765.04420: variable 'ansible_search_path' from source: unknown 27844 1726882765.04428: variable 'ansible_search_path' from source: unknown 27844 1726882765.04477: calling self._execute() 27844 1726882765.04589: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.04604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.04619: variable 'omit' from source: magic vars 27844 1726882765.05020: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.05042: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882765.05252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882765.05516: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882765.05573: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882765.05613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882765.05698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882765.05819: variable 'network_packages' from source: role '' defaults 27844 1726882765.05936: variable '__network_provider_setup' from source: role '' defaults 27844 1726882765.05950: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882765.06030: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882765.06043: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882765.06120: variable 
'__network_packages_default_nm' from source: role '' defaults 27844 1726882765.06324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882765.08849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882765.08918: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882765.08968: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882765.09006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882765.09046: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882765.09131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.09179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.09209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.09258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.09284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 
1726882765.09329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.09360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.09396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.09438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.09455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.09697: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27844 1726882765.09817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.09859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.09894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.09955: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.09990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.10050: variable 'ansible_python' from source: facts 27844 1726882765.10089: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27844 1726882765.10150: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882765.10208: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882765.10300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.10317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.10337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.10362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.10377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.10408: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.10427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.10445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.10474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.10485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.10578: variable 'network_connections' from source: task vars 27844 1726882765.10583: variable 'interface1' from source: play vars 27844 1726882765.10649: variable 'interface1' from source: play vars 27844 1726882765.10731: variable 'interface1_mac' from source: set_fact 27844 1726882765.10795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882765.10814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882765.10834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 27844 1726882765.10854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882765.10895: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882765.11068: variable 'network_connections' from source: task vars 27844 1726882765.11074: variable 'interface1' from source: play vars 27844 1726882765.11144: variable 'interface1' from source: play vars 27844 1726882765.11227: variable 'interface1_mac' from source: set_fact 27844 1726882765.11273: variable '__network_packages_default_wireless' from source: role '' defaults 27844 1726882765.11327: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882765.11519: variable 'network_connections' from source: task vars 27844 1726882765.11522: variable 'interface1' from source: play vars 27844 1726882765.11572: variable 'interface1' from source: play vars 27844 1726882765.11626: variable 'interface1_mac' from source: set_fact 27844 1726882765.11651: variable '__network_packages_default_team' from source: role '' defaults 27844 1726882765.11705: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882765.12022: variable 'network_connections' from source: task vars 27844 1726882765.12032: variable 'interface1' from source: play vars 27844 1726882765.12111: variable 'interface1' from source: play vars 27844 1726882765.12202: variable 'interface1_mac' from source: set_fact 27844 1726882765.12277: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882765.12349: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882765.12359: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882765.12432: variable 
'__network_packages_default_initscripts' from source: role '' defaults 27844 1726882765.12769: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27844 1726882765.13165: variable 'network_connections' from source: task vars 27844 1726882765.13172: variable 'interface1' from source: play vars 27844 1726882765.13215: variable 'interface1' from source: play vars 27844 1726882765.13263: variable 'interface1_mac' from source: set_fact 27844 1726882765.13278: variable 'ansible_distribution' from source: facts 27844 1726882765.13281: variable '__network_rh_distros' from source: role '' defaults 27844 1726882765.13286: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.13303: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27844 1726882765.13407: variable 'ansible_distribution' from source: facts 27844 1726882765.13410: variable '__network_rh_distros' from source: role '' defaults 27844 1726882765.13414: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.13427: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27844 1726882765.13533: variable 'ansible_distribution' from source: facts 27844 1726882765.13536: variable '__network_rh_distros' from source: role '' defaults 27844 1726882765.13541: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.13567: variable 'network_provider' from source: set_fact 27844 1726882765.13581: variable 'ansible_facts' from source: unknown 27844 1726882765.13957: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27844 1726882765.13960: when evaluation is False, skipping this task 27844 1726882765.13963: _execute() done 27844 1726882765.13966: dumping result to json 27844 1726882765.13973: done dumping result, returning 27844 1726882765.13978: done running TaskExecutor() for 
managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-efa9-466a-000000000074] 27844 1726882765.13983: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000074 27844 1726882765.14066: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000074 27844 1726882765.14069: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27844 1726882765.14121: no more pending results, returning what we have 27844 1726882765.14125: results queue empty 27844 1726882765.14126: checking for any_errors_fatal 27844 1726882765.14132: done checking for any_errors_fatal 27844 1726882765.14133: checking for max_fail_percentage 27844 1726882765.14135: done checking for max_fail_percentage 27844 1726882765.14135: checking to see if all hosts have failed and the running result is not ok 27844 1726882765.14136: done checking to see if all hosts have failed 27844 1726882765.14137: getting the remaining hosts for this loop 27844 1726882765.14138: done getting the remaining hosts for this loop 27844 1726882765.14141: getting the next task for host managed_node1 27844 1726882765.14147: done getting next task for host managed_node1 27844 1726882765.14152: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27844 1726882765.14155: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 27844 1726882765.14174: getting variables 27844 1726882765.14176: in VariableManager get_vars() 27844 1726882765.14216: Calling all_inventory to load vars for managed_node1 27844 1726882765.14218: Calling groups_inventory to load vars for managed_node1 27844 1726882765.14220: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882765.14233: Calling all_plugins_play to load vars for managed_node1 27844 1726882765.14236: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882765.14238: Calling groups_plugins_play to load vars for managed_node1 27844 1726882765.15405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882765.17072: done with get_vars() 27844 1726882765.17087: done getting variables 27844 1726882765.17139: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:25 -0400 (0:00:00.135) 0:00:24.248 ****** 27844 1726882765.17167: entering _queue_task() for managed_node1/package 27844 1726882765.17372: worker is 1 (out of 1 available) 27844 1726882765.17384: exiting _queue_task() for managed_node1/package 27844 1726882765.17395: done queuing things up, now waiting for results queue to drain 27844 1726882765.17397: waiting for pending results... 
27844 1726882765.17579: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27844 1726882765.17668: in run() - task 0e448fcc-3ce9-efa9-466a-000000000075 27844 1726882765.17678: variable 'ansible_search_path' from source: unknown 27844 1726882765.17682: variable 'ansible_search_path' from source: unknown 27844 1726882765.17712: calling self._execute() 27844 1726882765.17794: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.17798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.17807: variable 'omit' from source: magic vars 27844 1726882765.18078: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.18088: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882765.18170: variable 'network_state' from source: role '' defaults 27844 1726882765.18182: Evaluated conditional (network_state != {}): False 27844 1726882765.18189: when evaluation is False, skipping this task 27844 1726882765.18192: _execute() done 27844 1726882765.18195: dumping result to json 27844 1726882765.18198: done dumping result, returning 27844 1726882765.18204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-efa9-466a-000000000075] 27844 1726882765.18210: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000075 27844 1726882765.18300: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000075 27844 1726882765.18302: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882765.18347: no more pending results, returning what we have 27844 1726882765.18351: results queue empty 27844 1726882765.18352: checking 
for any_errors_fatal 27844 1726882765.18359: done checking for any_errors_fatal 27844 1726882765.18359: checking for max_fail_percentage 27844 1726882765.18361: done checking for max_fail_percentage 27844 1726882765.18362: checking to see if all hosts have failed and the running result is not ok 27844 1726882765.18363: done checking to see if all hosts have failed 27844 1726882765.18365: getting the remaining hosts for this loop 27844 1726882765.18366: done getting the remaining hosts for this loop 27844 1726882765.18370: getting the next task for host managed_node1 27844 1726882765.18375: done getting next task for host managed_node1 27844 1726882765.18378: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27844 1726882765.18381: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882765.18397: getting variables 27844 1726882765.18399: in VariableManager get_vars() 27844 1726882765.18432: Calling all_inventory to load vars for managed_node1 27844 1726882765.18434: Calling groups_inventory to load vars for managed_node1 27844 1726882765.18437: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882765.18445: Calling all_plugins_play to load vars for managed_node1 27844 1726882765.18447: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882765.18450: Calling groups_plugins_play to load vars for managed_node1 27844 1726882765.19492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882765.21277: done with get_vars() 27844 1726882765.21296: done getting variables 27844 1726882765.21347: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:25 -0400 (0:00:00.042) 0:00:24.290 ****** 27844 1726882765.21381: entering _queue_task() for managed_node1/package 27844 1726882765.21578: worker is 1 (out of 1 available) 27844 1726882765.21606: exiting _queue_task() for managed_node1/package 27844 1726882765.21619: done queuing things up, now waiting for results queue to drain 27844 1726882765.21621: waiting for pending results... 
27844 1726882765.21813: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27844 1726882765.21943: in run() - task 0e448fcc-3ce9-efa9-466a-000000000076 27844 1726882765.21968: variable 'ansible_search_path' from source: unknown 27844 1726882765.21978: variable 'ansible_search_path' from source: unknown 27844 1726882765.22017: calling self._execute() 27844 1726882765.22124: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.22144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.22167: variable 'omit' from source: magic vars 27844 1726882765.22531: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.22548: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882765.22682: variable 'network_state' from source: role '' defaults 27844 1726882765.22702: Evaluated conditional (network_state != {}): False 27844 1726882765.22709: when evaluation is False, skipping this task 27844 1726882765.22716: _execute() done 27844 1726882765.22723: dumping result to json 27844 1726882765.22731: done dumping result, returning 27844 1726882765.22742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-efa9-466a-000000000076] 27844 1726882765.22752: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000076 27844 1726882765.22863: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000076 27844 1726882765.22875: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882765.22922: no more pending results, returning what we have 27844 1726882765.22926: results queue empty 27844 1726882765.22927: checking for 
any_errors_fatal 27844 1726882765.22934: done checking for any_errors_fatal 27844 1726882765.22935: checking for max_fail_percentage 27844 1726882765.22937: done checking for max_fail_percentage 27844 1726882765.22938: checking to see if all hosts have failed and the running result is not ok 27844 1726882765.22938: done checking to see if all hosts have failed 27844 1726882765.22939: getting the remaining hosts for this loop 27844 1726882765.22941: done getting the remaining hosts for this loop 27844 1726882765.22944: getting the next task for host managed_node1 27844 1726882765.22949: done getting next task for host managed_node1 27844 1726882765.22953: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27844 1726882765.22956: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882765.22977: getting variables 27844 1726882765.22979: in VariableManager get_vars() 27844 1726882765.23017: Calling all_inventory to load vars for managed_node1 27844 1726882765.23020: Calling groups_inventory to load vars for managed_node1 27844 1726882765.23022: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882765.23031: Calling all_plugins_play to load vars for managed_node1 27844 1726882765.23033: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882765.23036: Calling groups_plugins_play to load vars for managed_node1 27844 1726882765.23879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882765.24805: done with get_vars() 27844 1726882765.24819: done getting variables 27844 1726882765.24857: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:25 -0400 (0:00:00.035) 0:00:24.325 ****** 27844 1726882765.24883: entering _queue_task() for managed_node1/service 27844 1726882765.25095: worker is 1 (out of 1 available) 27844 1726882765.25106: exiting _queue_task() for managed_node1/service 27844 1726882765.25232: done queuing things up, now waiting for results queue to drain 27844 1726882765.25234: waiting for pending results... 
27844 1726882765.25372: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27844 1726882765.25509: in run() - task 0e448fcc-3ce9-efa9-466a-000000000077 27844 1726882765.25529: variable 'ansible_search_path' from source: unknown 27844 1726882765.25537: variable 'ansible_search_path' from source: unknown 27844 1726882765.25586: calling self._execute() 27844 1726882765.25691: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.25702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.25714: variable 'omit' from source: magic vars 27844 1726882765.26104: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.26124: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882765.26247: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882765.26402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882765.27947: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882765.28001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882765.28026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882765.28051: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882765.28076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882765.28130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 27844 1726882765.28150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.28172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.28201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.28212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.28243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.28259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.28284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.28310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.28321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.28348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.28364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.28385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.28413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.28423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.28532: variable 'network_connections' from source: task vars 27844 1726882765.28541: variable 'interface1' from source: play vars 27844 1726882765.28594: variable 'interface1' from source: play vars 27844 1726882765.28648: variable 'interface1_mac' from source: set_fact 27844 1726882765.28706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882765.29054: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882765.29083: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882765.29105: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882765.29126: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882765.29158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882765.29179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882765.29196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.29213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882765.29259: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882765.29408: variable 'network_connections' from source: task vars 27844 1726882765.29412: variable 'interface1' from source: play vars 27844 1726882765.29454: variable 'interface1' from source: play vars 27844 1726882765.29510: variable 'interface1_mac' from source: set_fact 27844 1726882765.29535: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882765.29538: when evaluation is False, skipping this task 27844 1726882765.29541: _execute() done 27844 1726882765.29544: dumping result to json 27844 1726882765.29546: done dumping result, returning 27844 1726882765.29551: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000077] 
27844 1726882765.29561: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000077 27844 1726882765.29640: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000077 27844 1726882765.29642: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882765.29685: no more pending results, returning what we have 27844 1726882765.29690: results queue empty 27844 1726882765.29691: checking for any_errors_fatal 27844 1726882765.29697: done checking for any_errors_fatal 27844 1726882765.29698: checking for max_fail_percentage 27844 1726882765.29700: done checking for max_fail_percentage 27844 1726882765.29701: checking to see if all hosts have failed and the running result is not ok 27844 1726882765.29701: done checking to see if all hosts have failed 27844 1726882765.29702: getting the remaining hosts for this loop 27844 1726882765.29704: done getting the remaining hosts for this loop 27844 1726882765.29707: getting the next task for host managed_node1 27844 1726882765.29712: done getting next task for host managed_node1 27844 1726882765.29716: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27844 1726882765.29719: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882765.29735: getting variables 27844 1726882765.29737: in VariableManager get_vars() 27844 1726882765.29779: Calling all_inventory to load vars for managed_node1 27844 1726882765.29782: Calling groups_inventory to load vars for managed_node1 27844 1726882765.29784: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882765.29792: Calling all_plugins_play to load vars for managed_node1 27844 1726882765.29795: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882765.29797: Calling groups_plugins_play to load vars for managed_node1 27844 1726882765.30705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882765.31615: done with get_vars() 27844 1726882765.31629: done getting variables 27844 1726882765.31670: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:25 -0400 (0:00:00.068) 0:00:24.393 ****** 27844 1726882765.31691: entering _queue_task() for managed_node1/service 27844 1726882765.31871: worker is 1 (out of 1 available) 27844 1726882765.31885: exiting _queue_task() for managed_node1/service 27844 1726882765.31897: done queuing things up, now waiting for results queue to drain 27844 1726882765.31899: waiting for pending results... 
27844 1726882765.32071: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27844 1726882765.32167: in run() - task 0e448fcc-3ce9-efa9-466a-000000000078 27844 1726882765.32180: variable 'ansible_search_path' from source: unknown 27844 1726882765.32183: variable 'ansible_search_path' from source: unknown 27844 1726882765.32211: calling self._execute() 27844 1726882765.32287: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.32291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.32299: variable 'omit' from source: magic vars 27844 1726882765.32555: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.32565: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882765.32673: variable 'network_provider' from source: set_fact 27844 1726882765.32679: variable 'network_state' from source: role '' defaults 27844 1726882765.32688: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27844 1726882765.32699: variable 'omit' from source: magic vars 27844 1726882765.32731: variable 'omit' from source: magic vars 27844 1726882765.32750: variable 'network_service_name' from source: role '' defaults 27844 1726882765.32804: variable 'network_service_name' from source: role '' defaults 27844 1726882765.32882: variable '__network_provider_setup' from source: role '' defaults 27844 1726882765.32888: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882765.32938: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882765.32945: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882765.32992: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882765.33134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 27844 1726882765.34626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882765.34680: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882765.34706: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882765.34729: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882765.34752: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882765.34808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.34827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.34845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.34878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.34889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.34919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 27844 1726882765.34934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.34950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.34982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.34994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.35132: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27844 1726882765.35208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.35224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.35241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.35267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.35280: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.35342: variable 'ansible_python' from source: facts 27844 1726882765.35358: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27844 1726882765.35417: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882765.35473: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882765.35559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.35578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.35595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.35624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.35632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.35669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.35690: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.35706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.35735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.35750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.35843: variable 'network_connections' from source: task vars 27844 1726882765.35850: variable 'interface1' from source: play vars 27844 1726882765.35906: variable 'interface1' from source: play vars 27844 1726882765.35976: variable 'interface1_mac' from source: set_fact 27844 1726882765.36052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882765.36180: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882765.36214: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882765.36243: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882765.36275: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882765.36321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 
1726882765.36341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882765.36362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.36393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882765.36429: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882765.36602: variable 'network_connections' from source: task vars 27844 1726882765.36611: variable 'interface1' from source: play vars 27844 1726882765.36659: variable 'interface1' from source: play vars 27844 1726882765.36727: variable 'interface1_mac' from source: set_fact 27844 1726882765.36770: variable '__network_packages_default_wireless' from source: role '' defaults 27844 1726882765.36821: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882765.37003: variable 'network_connections' from source: task vars 27844 1726882765.37007: variable 'interface1' from source: play vars 27844 1726882765.37059: variable 'interface1' from source: play vars 27844 1726882765.37118: variable 'interface1_mac' from source: set_fact 27844 1726882765.37143: variable '__network_packages_default_team' from source: role '' defaults 27844 1726882765.37195: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882765.37377: variable 'network_connections' from source: task vars 27844 1726882765.37380: variable 'interface1' from source: play vars 27844 1726882765.37428: variable 'interface1' from source: play vars 27844 1726882765.37490: variable 'interface1_mac' from 
source: set_fact 27844 1726882765.37533: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882765.37579: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882765.37582: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882765.37624: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882765.37757: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27844 1726882765.38074: variable 'network_connections' from source: task vars 27844 1726882765.38077: variable 'interface1' from source: play vars 27844 1726882765.38123: variable 'interface1' from source: play vars 27844 1726882765.38223: variable 'interface1_mac' from source: set_fact 27844 1726882765.38226: variable 'ansible_distribution' from source: facts 27844 1726882765.38233: variable '__network_rh_distros' from source: role '' defaults 27844 1726882765.38235: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.38241: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27844 1726882765.38314: variable 'ansible_distribution' from source: facts 27844 1726882765.38317: variable '__network_rh_distros' from source: role '' defaults 27844 1726882765.38321: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.38333: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27844 1726882765.38443: variable 'ansible_distribution' from source: facts 27844 1726882765.38450: variable '__network_rh_distros' from source: role '' defaults 27844 1726882765.38453: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.38480: variable 'network_provider' from source: set_fact 27844 1726882765.38495: variable 'omit' from source: magic vars 27844 1726882765.38515: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882765.38534: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882765.38548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882765.38562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882765.38572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882765.38593: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882765.38596: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.38599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.38665: Set connection var ansible_shell_type to sh 27844 1726882765.38675: Set connection var ansible_connection to ssh 27844 1726882765.38679: Set connection var ansible_pipelining to False 27844 1726882765.38681: Set connection var ansible_timeout to 10 27844 1726882765.38683: Set connection var ansible_shell_executable to /bin/sh 27844 1726882765.38687: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882765.38707: variable 'ansible_shell_executable' from source: unknown 27844 1726882765.38710: variable 'ansible_connection' from source: unknown 27844 1726882765.38712: variable 'ansible_module_compression' from source: unknown 27844 1726882765.38714: variable 'ansible_shell_type' from source: unknown 27844 1726882765.38716: variable 'ansible_shell_executable' from source: unknown 27844 1726882765.38718: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.38722: variable 'ansible_pipelining' from source: unknown 27844 1726882765.38725: variable 'ansible_timeout' from 
source: unknown 27844 1726882765.38729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.38799: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882765.38808: variable 'omit' from source: magic vars 27844 1726882765.38812: starting attempt loop 27844 1726882765.38815: running the handler 27844 1726882765.38870: variable 'ansible_facts' from source: unknown 27844 1726882765.39331: _low_level_execute_command(): starting 27844 1726882765.39335: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882765.39829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882765.39843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882765.39861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882765.39881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882765.39892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.39929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882765.39940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882765.40052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882765.41713: stdout chunk (state=3): >>>/root <<< 27844 1726882765.41814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882765.41857: stderr chunk (state=3): >>><<< 27844 1726882765.41860: stdout chunk (state=3): >>><<< 27844 1726882765.41880: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882765.41890: _low_level_execute_command(): starting 27844 1726882765.41898: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666 `" && echo ansible-tmp-1726882765.4187956-29012-125307724067666="` echo /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666 `" ) && sleep 0' 27844 1726882765.42308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882765.42325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882765.42345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.42357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.42403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882765.42415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882765.42511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882765.44384: stdout chunk (state=3): 
>>>ansible-tmp-1726882765.4187956-29012-125307724067666=/root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666 <<< 27844 1726882765.44494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882765.44533: stderr chunk (state=3): >>><<< 27844 1726882765.44536: stdout chunk (state=3): >>><<< 27844 1726882765.44547: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882765.4187956-29012-125307724067666=/root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882765.44573: variable 'ansible_module_compression' from source: unknown 27844 1726882765.44611: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 27844 1726882765.44658: variable 'ansible_facts' from source: unknown 27844 
1726882765.44796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666/AnsiballZ_systemd.py 27844 1726882765.44895: Sending initial data 27844 1726882765.44903: Sent initial data (156 bytes) 27844 1726882765.45543: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882765.45552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882765.45582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 27844 1726882765.45586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.45633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882765.45636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882765.45737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882765.47501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882765.47592: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882765.47687: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpl79_6u4z /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666/AnsiballZ_systemd.py <<< 27844 1726882765.47777: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882765.49725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882765.49810: stderr chunk (state=3): >>><<< 27844 1726882765.49813: stdout chunk (state=3): >>><<< 27844 1726882765.49826: done transferring module to remote 27844 1726882765.49834: _low_level_execute_command(): starting 27844 1726882765.49838: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666/ /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666/AnsiballZ_systemd.py && sleep 0' 27844 1726882765.50254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882765.50257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882765.50295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.50298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882765.50301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882765.50304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.50354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882765.50358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882765.50455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882765.52224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882765.52271: stderr chunk (state=3): >>><<< 27844 1726882765.52275: stdout chunk (state=3): >>><<< 27844 1726882765.52286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882765.52289: _low_level_execute_command(): starting 27844 1726882765.52293: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666/AnsiballZ_systemd.py && sleep 0' 27844 1726882765.52694: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882765.52706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882765.52723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882765.52734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882765.52743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.52790: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 27844 1726882765.52809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882765.52908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882765.78030: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16240640", "MemoryAvailable": "infinity", "CPUUsageNSec": "1347382000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": <<< 27844 1726882765.78069: stdout chunk (state=3): >>>"0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", 
"RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", 
"CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perp<<< 27844 1726882765.78073: stdout chunk (state=3): >>>etual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27844 1726882765.79737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882765.79823: stderr chunk (state=3): >>><<< 27844 1726882765.79826: stdout chunk (state=3): >>><<< 27844 1726882765.79873: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16240640", "MemoryAvailable": "infinity", "CPUUsageNSec": "1347382000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882765.80140: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882765.80144: _low_level_execute_command(): starting 27844 1726882765.80147: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882765.4187956-29012-125307724067666/ > /dev/null 2>&1 && sleep 0' 27844 1726882765.80752: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882765.80769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882765.80784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882765.80802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882765.80848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882765.80860: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882765.80877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.80894: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882765.80904: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882765.80917: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882765.80935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882765.80948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882765.80962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882765.80979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882765.80989: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882765.81001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882765.81091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882765.81112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882765.81126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882765.81255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882765.83085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882765.83163: stderr chunk (state=3): >>><<< 27844 1726882765.83178: stdout chunk (state=3): >>><<< 27844 1726882765.83479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882765.83482: handler run complete 27844 1726882765.83485: attempt loop complete, returning result 27844 1726882765.83487: _execute() done 27844 1726882765.83489: dumping result to json 27844 1726882765.83491: done dumping result, returning 27844 1726882765.83494: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-efa9-466a-000000000078] 27844 1726882765.83496: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000078 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882765.83730: no more pending results, returning what we have 27844 1726882765.83734: results queue empty 27844 1726882765.83735: checking for any_errors_fatal 27844 1726882765.83741: done checking for any_errors_fatal 27844 1726882765.83742: checking for max_fail_percentage 27844 1726882765.83744: done checking for max_fail_percentage 27844 1726882765.83745: checking to see if all hosts have failed and the running result is not ok 27844 1726882765.83746: done checking to see if all hosts have failed 27844 
1726882765.83747: getting the remaining hosts for this loop 27844 1726882765.83748: done getting the remaining hosts for this loop 27844 1726882765.83752: getting the next task for host managed_node1 27844 1726882765.83759: done getting next task for host managed_node1 27844 1726882765.83768: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27844 1726882765.83771: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882765.83785: getting variables 27844 1726882765.83788: in VariableManager get_vars() 27844 1726882765.83829: Calling all_inventory to load vars for managed_node1 27844 1726882765.83832: Calling groups_inventory to load vars for managed_node1 27844 1726882765.83834: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882765.83846: Calling all_plugins_play to load vars for managed_node1 27844 1726882765.83849: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882765.83852: Calling groups_plugins_play to load vars for managed_node1 27844 1726882765.84673: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000078 27844 1726882765.84677: WORKER PROCESS EXITING 27844 1726882765.85774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882765.87586: done with get_vars() 27844 1726882765.87615: done getting variables 27844 1726882765.87703: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:25 -0400 (0:00:00.560) 0:00:24.954 ****** 27844 1726882765.87747: entering _queue_task() for managed_node1/service 27844 1726882765.88149: worker is 1 (out of 1 available) 27844 1726882765.88165: exiting _queue_task() for managed_node1/service 27844 1726882765.88180: done queuing things up, now waiting for results queue to drain 27844 1726882765.88182: waiting for pending results... 27844 1726882765.88498: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27844 1726882765.88657: in run() - task 0e448fcc-3ce9-efa9-466a-000000000079 27844 1726882765.88687: variable 'ansible_search_path' from source: unknown 27844 1726882765.88696: variable 'ansible_search_path' from source: unknown 27844 1726882765.88972: calling self._execute() 27844 1726882765.88976: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.88979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.88981: variable 'omit' from source: magic vars 27844 1726882765.89240: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.89252: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882765.89371: variable 'network_provider' from source: set_fact 27844 1726882765.89374: Evaluated conditional (network_provider == "nm"): True 27844 1726882765.89468: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 
1726882765.89549: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882765.89701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882765.92106: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882765.92168: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882765.92199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882765.92231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882765.92256: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882765.92333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.92358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.92385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.92426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.92439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 27844 1726882765.92484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.92506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.92529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.92570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.92581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.92619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882765.92640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882765.92662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.92704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882765.92717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882765.92845: variable 'network_connections' from source: task vars 27844 1726882765.92856: variable 'interface1' from source: play vars 27844 1726882765.92928: variable 'interface1' from source: play vars 27844 1726882765.93005: variable 'interface1_mac' from source: set_fact 27844 1726882765.93096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882765.93250: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882765.93287: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882765.93315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882765.93344: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882765.93387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882765.93407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882765.93431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882765.93455: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882765.93502: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882765.93732: variable 'network_connections' from source: task vars 27844 1726882765.93738: variable 'interface1' from source: play vars 27844 1726882765.93798: variable 'interface1' from source: play vars 27844 1726882765.93869: variable 'interface1_mac' from source: set_fact 27844 1726882765.93908: Evaluated conditional (__network_wpa_supplicant_required): False 27844 1726882765.93911: when evaluation is False, skipping this task 27844 1726882765.93914: _execute() done 27844 1726882765.93916: dumping result to json 27844 1726882765.93919: done dumping result, returning 27844 1726882765.93929: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-efa9-466a-000000000079] 27844 1726882765.93931: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000079 27844 1726882765.94022: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000079 27844 1726882765.94026: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27844 1726882765.94074: no more pending results, returning what we have 27844 1726882765.94078: results queue empty 27844 1726882765.94079: checking for any_errors_fatal 27844 1726882765.94100: done checking for any_errors_fatal 27844 1726882765.94101: checking for max_fail_percentage 27844 1726882765.94102: done checking for max_fail_percentage 27844 1726882765.94103: checking to see if all hosts have failed and the running result is not ok 27844 1726882765.94104: done checking to see if all hosts have failed 27844 1726882765.94104: getting the remaining hosts for 
this loop 27844 1726882765.94106: done getting the remaining hosts for this loop 27844 1726882765.94109: getting the next task for host managed_node1 27844 1726882765.94114: done getting next task for host managed_node1 27844 1726882765.94118: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27844 1726882765.94121: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882765.94138: getting variables 27844 1726882765.94140: in VariableManager get_vars() 27844 1726882765.94181: Calling all_inventory to load vars for managed_node1 27844 1726882765.94184: Calling groups_inventory to load vars for managed_node1 27844 1726882765.94186: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882765.94195: Calling all_plugins_play to load vars for managed_node1 27844 1726882765.94197: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882765.94200: Calling groups_plugins_play to load vars for managed_node1 27844 1726882765.95809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882765.97641: done with get_vars() 27844 1726882765.97662: done getting variables 27844 1726882765.97723: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:25 -0400 (0:00:00.100) 0:00:25.054 ****** 27844 1726882765.97770: entering _queue_task() for managed_node1/service 27844 1726882765.98082: worker is 1 (out of 1 available) 27844 1726882765.98094: exiting _queue_task() for managed_node1/service 27844 1726882765.98108: done queuing things up, now waiting for results queue to drain 27844 1726882765.98110: waiting for pending results... 27844 1726882765.98748: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 27844 1726882765.99050: in run() - task 0e448fcc-3ce9-efa9-466a-00000000007a 27844 1726882765.99074: variable 'ansible_search_path' from source: unknown 27844 1726882765.99101: variable 'ansible_search_path' from source: unknown 27844 1726882765.99180: calling self._execute() 27844 1726882765.99460: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882765.99477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882765.99494: variable 'omit' from source: magic vars 27844 1726882765.99956: variable 'ansible_distribution_major_version' from source: facts 27844 1726882765.99988: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882766.00129: variable 'network_provider' from source: set_fact 27844 1726882766.00141: Evaluated conditional (network_provider == "initscripts"): False 27844 1726882766.00148: when evaluation is False, skipping this task 27844 1726882766.00162: _execute() done 27844 1726882766.00184: dumping result to json 27844 1726882766.00192: done dumping result, 
returning 27844 1726882766.00202: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-efa9-466a-00000000007a] 27844 1726882766.00213: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882766.00353: no more pending results, returning what we have 27844 1726882766.00358: results queue empty 27844 1726882766.00359: checking for any_errors_fatal 27844 1726882766.00374: done checking for any_errors_fatal 27844 1726882766.00375: checking for max_fail_percentage 27844 1726882766.00377: done checking for max_fail_percentage 27844 1726882766.00378: checking to see if all hosts have failed and the running result is not ok 27844 1726882766.00379: done checking to see if all hosts have failed 27844 1726882766.00380: getting the remaining hosts for this loop 27844 1726882766.00382: done getting the remaining hosts for this loop 27844 1726882766.00385: getting the next task for host managed_node1 27844 1726882766.00392: done getting next task for host managed_node1 27844 1726882766.00396: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27844 1726882766.00400: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882766.00421: getting variables 27844 1726882766.00424: in VariableManager get_vars() 27844 1726882766.00469: Calling all_inventory to load vars for managed_node1 27844 1726882766.00472: Calling groups_inventory to load vars for managed_node1 27844 1726882766.00475: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882766.00488: Calling all_plugins_play to load vars for managed_node1 27844 1726882766.00491: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882766.00494: Calling groups_plugins_play to load vars for managed_node1 27844 1726882766.01650: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007a 27844 1726882766.01654: WORKER PROCESS EXITING 27844 1726882766.02536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882766.04353: done with get_vars() 27844 1726882766.04378: done getting variables 27844 1726882766.04434: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:26 -0400 (0:00:00.066) 0:00:25.121 ****** 27844 1726882766.04470: entering _queue_task() for managed_node1/copy 27844 1726882766.04715: worker is 1 (out of 1 available) 27844 1726882766.04726: exiting _queue_task() for managed_node1/copy 27844 1726882766.04738: done queuing things up, now waiting for results queue to drain 27844 1726882766.04739: waiting for pending results... 
27844 1726882766.05019: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27844 1726882766.05159: in run() - task 0e448fcc-3ce9-efa9-466a-00000000007b 27844 1726882766.05186: variable 'ansible_search_path' from source: unknown 27844 1726882766.05194: variable 'ansible_search_path' from source: unknown 27844 1726882766.05235: calling self._execute() 27844 1726882766.05337: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.05348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.05362: variable 'omit' from source: magic vars 27844 1726882766.05750: variable 'ansible_distribution_major_version' from source: facts 27844 1726882766.05773: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882766.05901: variable 'network_provider' from source: set_fact 27844 1726882766.05912: Evaluated conditional (network_provider == "initscripts"): False 27844 1726882766.05919: when evaluation is False, skipping this task 27844 1726882766.05926: _execute() done 27844 1726882766.05932: dumping result to json 27844 1726882766.05943: done dumping result, returning 27844 1726882766.05954: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-efa9-466a-00000000007b] 27844 1726882766.05965: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007b skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27844 1726882766.06108: no more pending results, returning what we have 27844 1726882766.06112: results queue empty 27844 1726882766.06113: checking for any_errors_fatal 27844 1726882766.06118: done checking for any_errors_fatal 27844 1726882766.06119: checking for max_fail_percentage 27844 
1726882766.06121: done checking for max_fail_percentage 27844 1726882766.06122: checking to see if all hosts have failed and the running result is not ok 27844 1726882766.06123: done checking to see if all hosts have failed 27844 1726882766.06124: getting the remaining hosts for this loop 27844 1726882766.06126: done getting the remaining hosts for this loop 27844 1726882766.06129: getting the next task for host managed_node1 27844 1726882766.06135: done getting next task for host managed_node1 27844 1726882766.06139: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27844 1726882766.06142: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882766.06162: getting variables 27844 1726882766.06168: in VariableManager get_vars() 27844 1726882766.06209: Calling all_inventory to load vars for managed_node1 27844 1726882766.06211: Calling groups_inventory to load vars for managed_node1 27844 1726882766.06214: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882766.06226: Calling all_plugins_play to load vars for managed_node1 27844 1726882766.06228: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882766.06231: Calling groups_plugins_play to load vars for managed_node1 27844 1726882766.07284: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007b 27844 1726882766.07288: WORKER PROCESS EXITING 27844 1726882766.07880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882766.09630: done with get_vars() 27844 1726882766.09650: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:26 -0400 (0:00:00.052) 0:00:25.173 ****** 27844 1726882766.09732: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 27844 1726882766.09984: worker is 1 (out of 1 available) 27844 1726882766.09996: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 27844 1726882766.10008: done queuing things up, now waiting for results queue to drain 27844 1726882766.10010: waiting for pending results... 
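The `network_connections` task queued here ultimately invokes the module with the arguments recorded later in this log's `_invocation` JSON. The values below are copied from that JSON verbatim; the surrounding variable layout (top-level play vars feeding `network_connections`, with the MAC resolved from the `interface1_mac` set_fact) is an assumption, since the play source itself is not part of the log.

```yaml
# Play variables consistent with the module_args captured in the log.
network_provider: nm
network_connections:
  - name: ethtest1
    mac: "8e:7f:0b:ff:ac:0e"   # per the log, resolved via interface1_mac
    type: ethernet
    autoconnect: false
    ip:
      address:
        - 198.51.100.4/24
        - 2001:db8::6/32
      route:
        - network: 198.58.10.64
          prefix: 26
          gateway: 198.51.100.102
          metric: 4
```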
27844 1726882766.10283: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27844 1726882766.10410: in run() - task 0e448fcc-3ce9-efa9-466a-00000000007c 27844 1726882766.10430: variable 'ansible_search_path' from source: unknown 27844 1726882766.10437: variable 'ansible_search_path' from source: unknown 27844 1726882766.10483: calling self._execute() 27844 1726882766.10590: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.10602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.10618: variable 'omit' from source: magic vars 27844 1726882766.10980: variable 'ansible_distribution_major_version' from source: facts 27844 1726882766.11001: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882766.11011: variable 'omit' from source: magic vars 27844 1726882766.11070: variable 'omit' from source: magic vars 27844 1726882766.11224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882766.13949: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882766.14023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882766.14086: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882766.14191: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882766.14296: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882766.14483: variable 'network_provider' from source: set_fact 27844 1726882766.14737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882766.14840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882766.14879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882766.14977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882766.15151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882766.15234: variable 'omit' from source: magic vars 27844 1726882766.15471: variable 'omit' from source: magic vars 27844 1726882766.15674: variable 'network_connections' from source: task vars 27844 1726882766.15735: variable 'interface1' from source: play vars 27844 1726882766.15927: variable 'interface1' from source: play vars 27844 1726882766.16101: variable 'interface1_mac' from source: set_fact 27844 1726882766.16583: variable 'omit' from source: magic vars 27844 1726882766.16596: variable '__lsr_ansible_managed' from source: task vars 27844 1726882766.16659: variable '__lsr_ansible_managed' from source: task vars 27844 1726882766.17316: Loaded config def from plugin (lookup/template) 27844 1726882766.17326: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27844 1726882766.17358: File lookup term: get_ansible_managed.j2 27844 1726882766.17448: variable 'ansible_search_path' from source: 
unknown 27844 1726882766.17458: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27844 1726882766.17497: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27844 1726882766.17586: variable 'ansible_search_path' from source: unknown 27844 1726882766.24992: variable 'ansible_managed' from source: unknown 27844 1726882766.25125: variable 'omit' from source: magic vars 27844 1726882766.25159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882766.25197: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882766.25218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882766.25237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 27844 1726882766.25250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882766.25285: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882766.25295: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.25307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.25403: Set connection var ansible_shell_type to sh 27844 1726882766.25416: Set connection var ansible_connection to ssh 27844 1726882766.25427: Set connection var ansible_pipelining to False 27844 1726882766.25439: Set connection var ansible_timeout to 10 27844 1726882766.25448: Set connection var ansible_shell_executable to /bin/sh 27844 1726882766.25457: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882766.25511: variable 'ansible_shell_executable' from source: unknown 27844 1726882766.25525: variable 'ansible_connection' from source: unknown 27844 1726882766.25531: variable 'ansible_module_compression' from source: unknown 27844 1726882766.25537: variable 'ansible_shell_type' from source: unknown 27844 1726882766.25542: variable 'ansible_shell_executable' from source: unknown 27844 1726882766.25549: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.25555: variable 'ansible_pipelining' from source: unknown 27844 1726882766.25561: variable 'ansible_timeout' from source: unknown 27844 1726882766.25578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.26271: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882766.26296: variable 'omit' from source: magic vars 27844 
1726882766.26311: starting attempt loop 27844 1726882766.26317: running the handler 27844 1726882766.26333: _low_level_execute_command(): starting 27844 1726882766.26342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882766.27591: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882766.27607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.27628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.27645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.27686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.27697: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882766.27708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.27727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882766.27740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882766.27749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882766.27759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.27778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.27794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.27806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.27818: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882766.27834: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.27923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882766.27948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882766.27974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882766.28197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882766.29771: stdout chunk (state=3): >>>/root <<< 27844 1726882766.29975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882766.29981: stdout chunk (state=3): >>><<< 27844 1726882766.29984: stderr chunk (state=3): >>><<< 27844 1726882766.30116: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882766.30120: _low_level_execute_command(): 
starting 27844 1726882766.30123: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821 `" && echo ansible-tmp-1726882766.3001218-29033-6438907137821="` echo /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821 `" ) && sleep 0' 27844 1726882766.30712: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882766.30724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.30736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.30803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.30852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.30901: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882766.30915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.30930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882766.30939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882766.30994: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882766.31012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.31024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.31037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.31047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.31093: stderr chunk 
(state=3): >>>debug2: match found <<< 27844 1726882766.31113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.32215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882766.32245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882766.32262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882766.32818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882766.34272: stdout chunk (state=3): >>>ansible-tmp-1726882766.3001218-29033-6438907137821=/root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821 <<< 27844 1726882766.34454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882766.34457: stdout chunk (state=3): >>><<< 27844 1726882766.34460: stderr chunk (state=3): >>><<< 27844 1726882766.34784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882766.3001218-29033-6438907137821=/root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882766.34791: variable 'ansible_module_compression' from source: unknown 27844 1726882766.34794: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 27844 1726882766.34797: variable 'ansible_facts' from source: unknown 27844 1726882766.34799: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821/AnsiballZ_network_connections.py 27844 1726882766.35603: Sending initial data 27844 1726882766.35607: Sent initial data (166 bytes) 27844 1726882766.36618: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.36621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.36662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.36668: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.36670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.36672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.36740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882766.36744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882766.36854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882766.38589: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 27844 1726882766.38592: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882766.38677: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882766.38771: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpkrk3kwlt /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821/AnsiballZ_network_connections.py <<< 27844 1726882766.38870: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882766.40533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882766.40662: stderr chunk (state=3): >>><<< 27844 1726882766.40667: stdout chunk (state=3): >>><<< 27844 1726882766.40692: done transferring module to remote 27844 
1726882766.40705: _low_level_execute_command(): starting 27844 1726882766.40709: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821/ /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821/AnsiballZ_network_connections.py && sleep 0' 27844 1726882766.41454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882766.41464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.41480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.41505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.41540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.41556: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882766.41568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.41584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882766.41592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882766.41599: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882766.41607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.41616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.41628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.41635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.41642: stderr chunk (state=3): >>>debug2: match found <<< 27844 
1726882766.41653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.41744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882766.41761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882766.41786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882766.41904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882766.43687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882766.43690: stdout chunk (state=3): >>><<< 27844 1726882766.43697: stderr chunk (state=3): >>><<< 27844 1726882766.43712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882766.43715: 
_low_level_execute_command(): starting 27844 1726882766.43720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821/AnsiballZ_network_connections.py && sleep 0' 27844 1726882766.44333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882766.44347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.44357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.44375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.44410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.44417: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882766.44426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.44439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882766.44452: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882766.44458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882766.44467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.44481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.44491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.44498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.44505: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882766.44514: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.44596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882766.44606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882766.44620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882766.44745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882766.69583: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "8e:7f:0b:ff:ac:0e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "8e:7f:0b:ff:ac:0e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27844 1726882766.71092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882766.71096: stdout chunk (state=3): >>><<< 27844 1726882766.71098: stderr chunk (state=3): >>><<< 27844 1726882766.71250: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "8e:7f:0b:ff:ac:0e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "8e:7f:0b:ff:ac:0e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882766.71254: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest1', 'mac': '8e:7f:0b:ff:ac:0e', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.4/24', '2001:db8::6/32'], 'route': [{'network': '198.58.10.64', 'prefix': 26, 'gateway': '198.51.100.102', 'metric': 4}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882766.71257: _low_level_execute_command(): starting 27844 1726882766.71260: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882766.3001218-29033-6438907137821/ > /dev/null 2>&1 && sleep 0' 27844 1726882766.71818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 
1726882766.71832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.71845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.71862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.71906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.71919: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882766.71932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.71949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882766.71959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882766.71978: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882766.71989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882766.72001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882766.72015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882766.72030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882766.72042: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882766.72055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882766.72133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882766.72151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882766.72172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882766.72307: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882766.74201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882766.74205: stdout chunk (state=3): >>><<< 27844 1726882766.74212: stderr chunk (state=3): >>><<< 27844 1726882766.74227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882766.74233: handler run complete 27844 1726882766.74276: attempt loop complete, returning result 27844 1726882766.74280: _execute() done 27844 1726882766.74282: dumping result to json 27844 1726882766.74288: done dumping result, returning 27844 1726882766.74298: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-efa9-466a-00000000007c] 27844 1726882766.74303: sending task result for 
task 0e448fcc-3ce9-efa9-466a-00000000007c 27844 1726882766.74426: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007c 27844 1726882766.74428: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "8e:7f:0b:ff:ac:0e", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828 27844 1726882766.74548: no more pending results, returning what we have 27844 1726882766.74552: results queue empty 27844 1726882766.74553: checking for any_errors_fatal 27844 1726882766.74559: done checking for any_errors_fatal 27844 1726882766.74560: checking for max_fail_percentage 27844 1726882766.74561: done checking for max_fail_percentage 27844 1726882766.74562: checking to see if all hosts have failed and the running result is not ok 27844 1726882766.74570: done checking to see if all hosts have failed 27844 1726882766.74571: getting the remaining hosts for this loop 27844 1726882766.74572: done getting the remaining hosts for this loop 27844 1726882766.74576: getting the next task for host managed_node1 27844 1726882766.74581: done getting next task for host managed_node1 27844 1726882766.74584: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27844 1726882766.74587: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882766.74596: getting variables 27844 1726882766.74598: in VariableManager get_vars() 27844 1726882766.74632: Calling all_inventory to load vars for managed_node1 27844 1726882766.74635: Calling groups_inventory to load vars for managed_node1 27844 1726882766.74637: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882766.74645: Calling all_plugins_play to load vars for managed_node1 27844 1726882766.74648: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882766.74650: Calling groups_plugins_play to load vars for managed_node1 27844 1726882766.77279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882766.80508: done with get_vars() 27844 1726882766.80528: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:26 -0400 (0:00:00.708) 0:00:25.882 ****** 27844 1726882766.80618: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 27844 1726882766.80846: worker is 1 (out of 1 available) 27844 1726882766.80861: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 27844 1726882766.80879: done queuing things up, now waiting for results queue to drain 27844 1726882766.80881: waiting for pending results... 
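The module arguments echoed in the `changed` result above fully describe the profile the role pushed: a non-autoconnecting ethernet profile `ethtest1` pinned to MAC `8e:7f:0b:ff:ac:0e`, with one static IPv4 and one IPv6 address and a single IPv4 route. As a quick sanity check outside the log, the same addressing data can be validated with Python's stdlib `ipaddress` module (a minimal sketch; the dict below is copied from the log output, not taken from any documented role API):

```python
import ipaddress

# Connection profile exactly as echoed in module_args above.
profile = {
    "name": "ethtest1",
    "type": "ethernet",
    "mac": "8e:7f:0b:ff:ac:0e",
    "autoconnect": False,
    "ip": {
        "address": ["198.51.100.4/24", "2001:db8::6/32"],
        "route": [{"network": "198.58.10.64", "prefix": 26,
                   "gateway": "198.51.100.102", "metric": 4}],
    },
}

# Parse the interface addresses (ip_interface keeps the host bits).
ifaces = [ipaddress.ip_interface(a) for a in profile["ip"]["address"]]

# The route gateway must be reachable on one of the connected subnets.
gw = ipaddress.ip_address(profile["ip"]["route"][0]["gateway"])
assert any(gw in i.network for i in ifaces)

# The route destination must be a valid network for its prefix length.
route = profile["ip"]["route"][0]
dest = ipaddress.ip_network(f"{route['network']}/{route['prefix']}")
print(dest)  # 198.58.10.64/26
```

Here `198.51.100.102` falls inside `198.51.100.0/24`, the subnet implied by the first address, which is why NetworkManager can accept the route without an on-link flag.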
27844 1726882766.81058: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 27844 1726882766.81151: in run() - task 0e448fcc-3ce9-efa9-466a-00000000007d 27844 1726882766.81163: variable 'ansible_search_path' from source: unknown 27844 1726882766.81170: variable 'ansible_search_path' from source: unknown 27844 1726882766.81199: calling self._execute() 27844 1726882766.81273: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.81277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.81288: variable 'omit' from source: magic vars 27844 1726882766.81554: variable 'ansible_distribution_major_version' from source: facts 27844 1726882766.81564: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882766.81646: variable 'network_state' from source: role '' defaults 27844 1726882766.81654: Evaluated conditional (network_state != {}): False 27844 1726882766.81658: when evaluation is False, skipping this task 27844 1726882766.81661: _execute() done 27844 1726882766.81663: dumping result to json 27844 1726882766.81668: done dumping result, returning 27844 1726882766.81672: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-efa9-466a-00000000007d] 27844 1726882766.81677: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007d 27844 1726882766.81758: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007d 27844 1726882766.81760: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882766.81817: no more pending results, returning what we have 27844 1726882766.81822: results queue empty 27844 1726882766.81823: checking for any_errors_fatal 27844 1726882766.81835: done checking for any_errors_fatal 
27844 1726882766.81835: checking for max_fail_percentage 27844 1726882766.81837: done checking for max_fail_percentage 27844 1726882766.81838: checking to see if all hosts have failed and the running result is not ok 27844 1726882766.81839: done checking to see if all hosts have failed 27844 1726882766.81839: getting the remaining hosts for this loop 27844 1726882766.81841: done getting the remaining hosts for this loop 27844 1726882766.81844: getting the next task for host managed_node1 27844 1726882766.81850: done getting next task for host managed_node1 27844 1726882766.81853: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27844 1726882766.81856: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882766.81879: getting variables 27844 1726882766.81881: in VariableManager get_vars() 27844 1726882766.81918: Calling all_inventory to load vars for managed_node1 27844 1726882766.81921: Calling groups_inventory to load vars for managed_node1 27844 1726882766.81923: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882766.81931: Calling all_plugins_play to load vars for managed_node1 27844 1726882766.81934: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882766.81936: Calling groups_plugins_play to load vars for managed_node1 27844 1726882766.83241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882766.84268: done with get_vars() 27844 1726882766.84284: done getting variables 27844 1726882766.84324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:26 -0400 (0:00:00.037) 0:00:25.920 ****** 27844 1726882766.84346: entering _queue_task() for managed_node1/debug 27844 1726882766.84530: worker is 1 (out of 1 available) 27844 1726882766.84542: exiting _queue_task() for managed_node1/debug 27844 1726882766.84553: done queuing things up, now waiting for results queue to drain 27844 1726882766.84555: waiting for pending results... 
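The skip above follows the pattern visible throughout this run: conditionals are evaluated in order, `ansible_distribution_major_version != '6'` passes, but `network_state` comes from the role defaults as an empty dict, so `network_state != {}` is False and the task is skipped. A minimal sketch of that short-circuit order (plain Python standing in for Ansible's Jinja2 conditional handling, which is an assumption here; the major-version value is also assumed, since the log only shows it is not '6'):

```python
# Stand-ins for the variables the log shows being resolved.
ansible_distribution_major_version = "9"  # assumed; log only shows != '6'
network_state = {}                        # role default ("from source: role '' defaults")

# Conditions evaluate in order; the first False skips the task.
conditions = [
    ("ansible_distribution_major_version != '6'",
     ansible_distribution_major_version != "6"),
    ("network_state != {}", network_state != {}),
]

for expr, result in conditions:
    print(f"Evaluated conditional ({expr}): {result}")
    if not result:
        print("when evaluation is False, skipping this task")
        break
```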
27844 1726882766.84739: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27844 1726882766.84821: in run() - task 0e448fcc-3ce9-efa9-466a-00000000007e 27844 1726882766.84836: variable 'ansible_search_path' from source: unknown 27844 1726882766.84839: variable 'ansible_search_path' from source: unknown 27844 1726882766.84868: calling self._execute() 27844 1726882766.84945: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.84949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.84957: variable 'omit' from source: magic vars 27844 1726882766.85474: variable 'ansible_distribution_major_version' from source: facts 27844 1726882766.85517: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882766.85521: variable 'omit' from source: magic vars 27844 1726882766.85571: variable 'omit' from source: magic vars 27844 1726882766.85650: variable 'omit' from source: magic vars 27844 1726882766.85712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882766.85759: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882766.85785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882766.85815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882766.85839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882766.85880: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882766.85889: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.85897: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 27844 1726882766.86033: Set connection var ansible_shell_type to sh 27844 1726882766.86041: Set connection var ansible_connection to ssh 27844 1726882766.86051: Set connection var ansible_pipelining to False 27844 1726882766.86061: Set connection var ansible_timeout to 10 27844 1726882766.86080: Set connection var ansible_shell_executable to /bin/sh 27844 1726882766.86093: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882766.86125: variable 'ansible_shell_executable' from source: unknown 27844 1726882766.86140: variable 'ansible_connection' from source: unknown 27844 1726882766.86147: variable 'ansible_module_compression' from source: unknown 27844 1726882766.86153: variable 'ansible_shell_type' from source: unknown 27844 1726882766.86158: variable 'ansible_shell_executable' from source: unknown 27844 1726882766.86168: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.86176: variable 'ansible_pipelining' from source: unknown 27844 1726882766.86181: variable 'ansible_timeout' from source: unknown 27844 1726882766.86187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.86336: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882766.86360: variable 'omit' from source: magic vars 27844 1726882766.86377: starting attempt loop 27844 1726882766.86384: running the handler 27844 1726882766.86536: variable '__network_connections_result' from source: set_fact 27844 1726882766.86602: handler run complete 27844 1726882766.86629: attempt loop complete, returning result 27844 1726882766.86637: _execute() done 27844 1726882766.86643: dumping result to json 27844 1726882766.86649: 
done dumping result, returning 27844 1726882766.86659: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-efa9-466a-00000000007e] 27844 1726882766.86688: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007e 27844 1726882766.87245: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007e 27844 1726882766.87248: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828" ] } 27844 1726882766.87332: no more pending results, returning what we have 27844 1726882766.87336: results queue empty 27844 1726882766.87336: checking for any_errors_fatal 27844 1726882766.87341: done checking for any_errors_fatal 27844 1726882766.87341: checking for max_fail_percentage 27844 1726882766.87343: done checking for max_fail_percentage 27844 1726882766.87343: checking to see if all hosts have failed and the running result is not ok 27844 1726882766.87344: done checking to see if all hosts have failed 27844 1726882766.87345: getting the remaining hosts for this loop 27844 1726882766.87346: done getting the remaining hosts for this loop 27844 1726882766.87349: getting the next task for host managed_node1 27844 1726882766.87354: done getting next task for host managed_node1 27844 1726882766.87360: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27844 1726882766.87363: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882766.87381: getting variables 27844 1726882766.87383: in VariableManager get_vars() 27844 1726882766.87416: Calling all_inventory to load vars for managed_node1 27844 1726882766.87419: Calling groups_inventory to load vars for managed_node1 27844 1726882766.87421: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882766.87430: Calling all_plugins_play to load vars for managed_node1 27844 1726882766.87433: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882766.87436: Calling groups_plugins_play to load vars for managed_node1 27844 1726882766.89034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882766.90983: done with get_vars() 27844 1726882766.91009: done getting variables 27844 1726882766.91090: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:26 -0400 (0:00:00.067) 0:00:25.987 ****** 27844 1726882766.91124: entering _queue_task() for managed_node1/debug 27844 1726882766.91491: worker is 1 (out of 1 available) 27844 1726882766.91504: exiting _queue_task() for managed_node1/debug 27844 1726882766.91520: done queuing things up, now waiting for results queue to drain 27844 1726882766.91522: waiting for pending results... 
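The `stderr_lines` surfaced by the debug task above arrive in a fixed shape, e.g. `[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, <uuid>`. When post-processing a run log, the connection name, action, and profile UUID can be pulled out with a small regex (a hedged sketch against the one line this log shows; the format is inferred from that line, not from documented module output):

```python
import re

# One stderr line as emitted in the task result above.
line = ("[002] #0, state:None persistent_state:present, "
        "'ethtest1': update connection ethtest1, "
        "3f556ad4-433c-441e-8cb1-58aca9efd828")

# The name is the quoted token; the UUID is the trailing 36-char identifier.
UUID = r"[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}"
m = re.search(rf"'(?P<name>[^']+)': (?P<action>.+?), (?P<uuid>{UUID})$", line)

print(m.group("name"))    # ethtest1
print(m.group("action"))  # update connection ethtest1
print(m.group("uuid"))    # 3f556ad4-433c-441e-8cb1-58aca9efd828
```

The lazy `.+?` on the action keeps the match from swallowing the comma before the UUID, and anchoring the UUID at `$` avoids false matches on hex-looking words earlier in the line.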
27844 1726882766.92315: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27844 1726882766.92452: in run() - task 0e448fcc-3ce9-efa9-466a-00000000007f 27844 1726882766.92471: variable 'ansible_search_path' from source: unknown 27844 1726882766.92474: variable 'ansible_search_path' from source: unknown 27844 1726882766.92523: calling self._execute() 27844 1726882766.92611: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.92614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.92623: variable 'omit' from source: magic vars 27844 1726882766.92999: variable 'ansible_distribution_major_version' from source: facts 27844 1726882766.93009: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882766.93015: variable 'omit' from source: magic vars 27844 1726882766.93061: variable 'omit' from source: magic vars 27844 1726882766.93090: variable 'omit' from source: magic vars 27844 1726882766.93122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882766.93155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882766.93171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882766.93185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882766.93194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882766.93215: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882766.93218: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.93221: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 27844 1726882766.93292: Set connection var ansible_shell_type to sh 27844 1726882766.93297: Set connection var ansible_connection to ssh 27844 1726882766.93300: Set connection var ansible_pipelining to False 27844 1726882766.93306: Set connection var ansible_timeout to 10 27844 1726882766.93311: Set connection var ansible_shell_executable to /bin/sh 27844 1726882766.93316: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882766.93336: variable 'ansible_shell_executable' from source: unknown 27844 1726882766.93340: variable 'ansible_connection' from source: unknown 27844 1726882766.93342: variable 'ansible_module_compression' from source: unknown 27844 1726882766.93345: variable 'ansible_shell_type' from source: unknown 27844 1726882766.93347: variable 'ansible_shell_executable' from source: unknown 27844 1726882766.93349: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.93351: variable 'ansible_pipelining' from source: unknown 27844 1726882766.93353: variable 'ansible_timeout' from source: unknown 27844 1726882766.93358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.93455: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882766.93471: variable 'omit' from source: magic vars 27844 1726882766.93475: starting attempt loop 27844 1726882766.93479: running the handler 27844 1726882766.93514: variable '__network_connections_result' from source: set_fact 27844 1726882766.93574: variable '__network_connections_result' from source: set_fact 27844 1726882766.93670: handler run complete 27844 1726882766.93690: attempt loop complete, returning result 27844 1726882766.93693: 
_execute() done 27844 1726882766.93695: dumping result to json 27844 1726882766.93699: done dumping result, returning 27844 1726882766.93705: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-efa9-466a-00000000007f] 27844 1726882766.93710: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007f ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "8e:7f:0b:ff:ac:0e", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 3f556ad4-433c-441e-8cb1-58aca9efd828" ] } } 27844 1726882766.93903: no more pending results, returning what we have 27844 1726882766.93907: results queue empty 27844 1726882766.93907: checking for any_errors_fatal 27844 1726882766.93913: done checking for any_errors_fatal 27844 1726882766.93914: checking for max_fail_percentage 27844 1726882766.93915: done checking for max_fail_percentage 27844 1726882766.93916: checking to see if all hosts have failed and the running result is not ok 27844 1726882766.93917: done checking to see if all hosts have failed 27844 1726882766.93918: getting the remaining hosts for this loop 27844 1726882766.93919: done getting the remaining hosts for this loop 27844 1726882766.93922: getting the next task for host managed_node1 27844 
1726882766.93926: done getting next task for host managed_node1 27844 1726882766.93930: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27844 1726882766.93933: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882766.93942: getting variables 27844 1726882766.93943: in VariableManager get_vars() 27844 1726882766.93982: Calling all_inventory to load vars for managed_node1 27844 1726882766.93984: Calling groups_inventory to load vars for managed_node1 27844 1726882766.93986: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882766.93993: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000007f 27844 1726882766.94003: Calling all_plugins_play to load vars for managed_node1 27844 1726882766.94008: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882766.94011: Calling groups_plugins_play to load vars for managed_node1 27844 1726882766.94584: WORKER PROCESS EXITING 27844 1726882766.95681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882766.97686: done with get_vars() 27844 1726882766.97707: done getting variables 27844 1726882766.97769: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:26 -0400 (0:00:00.066) 0:00:26.054 ****** 27844 1726882766.97805: entering _queue_task() for managed_node1/debug 27844 1726882766.98086: worker is 1 (out of 1 available) 27844 1726882766.98102: exiting _queue_task() for managed_node1/debug 27844 1726882766.98114: done queuing things up, now waiting for results queue to drain 27844 1726882766.98115: waiting for pending results... 27844 1726882766.98409: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27844 1726882766.98560: in run() - task 0e448fcc-3ce9-efa9-466a-000000000080 27844 1726882766.98585: variable 'ansible_search_path' from source: unknown 27844 1726882766.98592: variable 'ansible_search_path' from source: unknown 27844 1726882766.98631: calling self._execute() 27844 1726882766.98741: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882766.98758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882766.98778: variable 'omit' from source: magic vars 27844 1726882766.99174: variable 'ansible_distribution_major_version' from source: facts 27844 1726882766.99195: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882766.99331: variable 'network_state' from source: role '' defaults 27844 1726882766.99344: Evaluated conditional (network_state != {}): False 27844 1726882766.99350: when evaluation is False, skipping this task 27844 1726882766.99356: _execute() done 27844 1726882766.99363: dumping result to json 27844 1726882766.99375: done 
dumping result, returning 27844 1726882766.99386: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-efa9-466a-000000000080] 27844 1726882766.99394: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000080 27844 1726882766.99503: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000080 27844 1726882766.99511: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 27844 1726882766.99575: no more pending results, returning what we have 27844 1726882766.99579: results queue empty 27844 1726882766.99580: checking for any_errors_fatal 27844 1726882766.99592: done checking for any_errors_fatal 27844 1726882766.99593: checking for max_fail_percentage 27844 1726882766.99595: done checking for max_fail_percentage 27844 1726882766.99596: checking to see if all hosts have failed and the running result is not ok 27844 1726882766.99596: done checking to see if all hosts have failed 27844 1726882766.99597: getting the remaining hosts for this loop 27844 1726882766.99599: done getting the remaining hosts for this loop 27844 1726882766.99603: getting the next task for host managed_node1 27844 1726882766.99608: done getting next task for host managed_node1 27844 1726882766.99612: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27844 1726882766.99616: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 27844 1726882766.99638: getting variables 27844 1726882766.99641: in VariableManager get_vars() 27844 1726882766.99683: Calling all_inventory to load vars for managed_node1 27844 1726882766.99685: Calling groups_inventory to load vars for managed_node1 27844 1726882766.99688: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882766.99700: Calling all_plugins_play to load vars for managed_node1 27844 1726882766.99703: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882766.99706: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.01358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.03080: done with get_vars() 27844 1726882767.03101: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:27 -0400 (0:00:00.053) 0:00:26.108 ****** 27844 1726882767.03199: entering _queue_task() for managed_node1/ping 27844 1726882767.03459: worker is 1 (out of 1 available) 27844 1726882767.03474: exiting _queue_task() for managed_node1/ping 27844 1726882767.03486: done queuing things up, now waiting for results queue to drain 27844 1726882767.03488: waiting for pending results... 
27844 1726882767.03770: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 27844 1726882767.03910: in run() - task 0e448fcc-3ce9-efa9-466a-000000000081 27844 1726882767.03933: variable 'ansible_search_path' from source: unknown 27844 1726882767.03940: variable 'ansible_search_path' from source: unknown 27844 1726882767.03982: calling self._execute() 27844 1726882767.04087: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.04098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.04116: variable 'omit' from source: magic vars 27844 1726882767.04493: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.04510: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882767.04520: variable 'omit' from source: magic vars 27844 1726882767.04591: variable 'omit' from source: magic vars 27844 1726882767.04628: variable 'omit' from source: magic vars 27844 1726882767.04679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882767.04718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882767.04739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882767.04768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882767.04784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882767.04817: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882767.04825: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.04832: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 27844 1726882767.04940: Set connection var ansible_shell_type to sh 27844 1726882767.04948: Set connection var ansible_connection to ssh 27844 1726882767.04956: Set connection var ansible_pipelining to False 27844 1726882767.04973: Set connection var ansible_timeout to 10 27844 1726882767.04984: Set connection var ansible_shell_executable to /bin/sh 27844 1726882767.04992: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882767.05026: variable 'ansible_shell_executable' from source: unknown 27844 1726882767.05033: variable 'ansible_connection' from source: unknown 27844 1726882767.05039: variable 'ansible_module_compression' from source: unknown 27844 1726882767.05045: variable 'ansible_shell_type' from source: unknown 27844 1726882767.05050: variable 'ansible_shell_executable' from source: unknown 27844 1726882767.05056: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.05062: variable 'ansible_pipelining' from source: unknown 27844 1726882767.05073: variable 'ansible_timeout' from source: unknown 27844 1726882767.05085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.05288: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882767.05308: variable 'omit' from source: magic vars 27844 1726882767.05317: starting attempt loop 27844 1726882767.05323: running the handler 27844 1726882767.05342: _low_level_execute_command(): starting 27844 1726882767.05353: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882767.06146: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882767.06159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 
1726882767.06182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.06200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.06243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.06254: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882767.06270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.06295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882767.06309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882767.06322: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882767.06339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.06355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.06376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.06394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.06407: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882767.06421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.06507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882767.06525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882767.06540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882767.06689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 
1726882767.08343: stdout chunk (state=3): >>>/root <<< 27844 1726882767.08446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882767.08519: stderr chunk (state=3): >>><<< 27844 1726882767.08523: stdout chunk (state=3): >>><<< 27844 1726882767.08633: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882767.08637: _low_level_execute_command(): starting 27844 1726882767.08640: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768 `" && echo ansible-tmp-1726882767.0854435-29074-13082113770768="` echo /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768 `" ) && sleep 0' 27844 1726882767.09256: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 27844 1726882767.09275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.09290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.09314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.09355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.09370: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882767.09384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.09400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882767.09410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882767.09425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882767.09436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.09448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.09461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.09478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.09488: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882767.09500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.09585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882767.09605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882767.09619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 27844 1726882767.09744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882767.11618: stdout chunk (state=3): >>>ansible-tmp-1726882767.0854435-29074-13082113770768=/root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768 <<< 27844 1726882767.11807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882767.11810: stdout chunk (state=3): >>><<< 27844 1726882767.11812: stderr chunk (state=3): >>><<< 27844 1726882767.11878: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882767.0854435-29074-13082113770768=/root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882767.12144: variable 'ansible_module_compression' from source: unknown 27844 1726882767.12147: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 27844 1726882767.12149: variable 'ansible_facts' from source: unknown 27844 1726882767.12152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768/AnsiballZ_ping.py 27844 1726882767.12216: Sending initial data 27844 1726882767.12219: Sent initial data (152 bytes) 27844 1726882767.13229: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882767.13245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.13259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.13281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.13322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.13333: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882767.13349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.13371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882767.13383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882767.13393: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882767.13407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.13421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.13435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.13445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 
1726882767.13458: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882767.13477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.13551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882767.13575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882767.13589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882767.13713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882767.15447: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882767.15537: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882767.15634: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpi7s__9tt /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768/AnsiballZ_ping.py <<< 27844 1726882767.15726: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882767.16974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882767.17139: stderr chunk (state=3): >>><<< 27844 1726882767.17142: stdout chunk (state=3): >>><<< 27844 1726882767.17144: done transferring module 
to remote 27844 1726882767.17146: _low_level_execute_command(): starting 27844 1726882767.17153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768/ /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768/AnsiballZ_ping.py && sleep 0' 27844 1726882767.17722: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882767.17735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.17747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.17763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.17807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.17818: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882767.17831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.17847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882767.17857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882767.17873: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882767.17885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.17898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.17913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.17923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.17933: stderr chunk (state=3): >>>debug2: match found <<< 27844 
1726882767.17944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.18023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882767.18039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882767.18052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882767.18178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882767.19896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882767.19958: stderr chunk (state=3): >>><<< 27844 1726882767.19961: stdout chunk (state=3): >>><<< 27844 1726882767.20048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882767.20052: 
_low_level_execute_command(): starting 27844 1726882767.20055: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768/AnsiballZ_ping.py && sleep 0' 27844 1726882767.20618: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882767.20633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.20650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.20675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.20717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.20729: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882767.20744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.20762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882767.20782: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882767.20794: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882767.20807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.20822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.20838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.20851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.20863: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882767.20883: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.20956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882767.20978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882767.20993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882767.21123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882767.33819: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27844 1726882767.34778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882767.34850: stderr chunk (state=3): >>><<< 27844 1726882767.34853: stdout chunk (state=3): >>><<< 27844 1726882767.34875: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882767.34898: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882767.34905: _low_level_execute_command(): starting 27844 1726882767.34911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882767.0854435-29074-13082113770768/ > /dev/null 2>&1 && sleep 0' 27844 1726882767.35538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882767.35546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.35557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.35574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.35612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.35619: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882767.35629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.35643: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 27844 1726882767.35651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882767.35657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882767.35669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882767.35680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882767.35691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882767.35698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882767.35702: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882767.35712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882767.35802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882767.35807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882767.35815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882767.35932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882767.37780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882767.37795: stderr chunk (state=3): >>><<< 27844 1726882767.37805: stdout chunk (state=3): >>><<< 27844 1726882767.37828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882767.37833: handler run complete 27844 1726882767.37850: attempt loop complete, returning result 27844 1726882767.37852: _execute() done 27844 1726882767.37855: dumping result to json 27844 1726882767.37857: done dumping result, returning 27844 1726882767.37870: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-efa9-466a-000000000081] 27844 1726882767.37873: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000081 27844 1726882767.37968: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000081 27844 1726882767.37971: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 27844 1726882767.38151: no more pending results, returning what we have 27844 1726882767.38155: results queue empty 27844 1726882767.38156: checking for any_errors_fatal 27844 1726882767.38162: done checking for any_errors_fatal 27844 1726882767.38164: checking for max_fail_percentage 27844 1726882767.38166: done checking for max_fail_percentage 27844 1726882767.38168: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.38168: done checking to see if all hosts have 
failed 27844 1726882767.38169: getting the remaining hosts for this loop 27844 1726882767.38171: done getting the remaining hosts for this loop 27844 1726882767.38175: getting the next task for host managed_node1 27844 1726882767.38183: done getting next task for host managed_node1 27844 1726882767.38186: ^ task is: TASK: meta (role_complete) 27844 1726882767.38189: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27844 1726882767.38202: getting variables 27844 1726882767.38204: in VariableManager get_vars() 27844 1726882767.38249: Calling all_inventory to load vars for managed_node1 27844 1726882767.38252: Calling groups_inventory to load vars for managed_node1 27844 1726882767.38255: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.38268: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.38271: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.38275: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.44166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.45876: done with get_vars() 27844 1726882767.45901: done getting variables 27844 1726882767.45966: done queuing things up, now waiting for results queue to drain 27844 1726882767.45968: results queue empty 27844 1726882767.45968: checking for any_errors_fatal 27844 1726882767.45971: done checking for 
any_errors_fatal 27844 1726882767.45972: checking for max_fail_percentage 27844 1726882767.45973: done checking for max_fail_percentage 27844 1726882767.45974: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.45975: done checking to see if all hosts have failed 27844 1726882767.45975: getting the remaining hosts for this loop 27844 1726882767.45976: done getting the remaining hosts for this loop 27844 1726882767.45979: getting the next task for host managed_node1 27844 1726882767.45982: done getting next task for host managed_node1 27844 1726882767.45984: ^ task is: TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 27844 1726882767.45986: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882767.45996: getting variables 27844 1726882767.45997: in VariableManager get_vars() 27844 1726882767.46010: Calling all_inventory to load vars for managed_node1 27844 1726882767.46012: Calling groups_inventory to load vars for managed_node1 27844 1726882767.46015: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.46019: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.46022: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.46024: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.47244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.48940: done with get_vars() 27844 1726882767.48959: done getting variables 27844 1726882767.49003: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the warning about specifying the route without the output device is logged for initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:122 Friday 20 September 2024 21:39:27 -0400 (0:00:00.458) 0:00:26.566 ****** 27844 1726882767.49033: entering _queue_task() for managed_node1/assert 27844 1726882767.49356: worker is 1 (out of 1 available) 27844 1726882767.49370: exiting _queue_task() for managed_node1/assert 27844 1726882767.49384: done queuing things up, now waiting for results queue to drain 27844 1726882767.49385: waiting for pending results... 
27844 1726882767.49687: running TaskExecutor() for managed_node1/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 27844 1726882767.49779: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b1 27844 1726882767.49799: variable 'ansible_search_path' from source: unknown 27844 1726882767.49841: calling self._execute() 27844 1726882767.49955: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.49959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.49971: variable 'omit' from source: magic vars 27844 1726882767.50374: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.50387: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882767.50501: variable 'network_provider' from source: set_fact 27844 1726882767.50507: Evaluated conditional (network_provider == "initscripts"): False 27844 1726882767.50510: when evaluation is False, skipping this task 27844 1726882767.50513: _execute() done 27844 1726882767.50516: dumping result to json 27844 1726882767.50518: done dumping result, returning 27844 1726882767.50525: done running TaskExecutor() for managed_node1/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider [0e448fcc-3ce9-efa9-466a-0000000000b1] 27844 1726882767.50529: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b1 27844 1726882767.50626: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b1 27844 1726882767.50630: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27844 1726882767.50679: no more pending results, returning what we have 27844 1726882767.50684: results queue empty 27844 1726882767.50685: checking for any_errors_fatal 27844 
1726882767.50687: done checking for any_errors_fatal 27844 1726882767.50688: checking for max_fail_percentage 27844 1726882767.50689: done checking for max_fail_percentage 27844 1726882767.50690: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.50691: done checking to see if all hosts have failed 27844 1726882767.50692: getting the remaining hosts for this loop 27844 1726882767.50694: done getting the remaining hosts for this loop 27844 1726882767.50697: getting the next task for host managed_node1 27844 1726882767.50702: done getting next task for host managed_node1 27844 1726882767.50705: ^ task is: TASK: Assert that no warning is logged for nm provider 27844 1726882767.50707: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27844 1726882767.50710: getting variables 27844 1726882767.50712: in VariableManager get_vars() 27844 1726882767.50751: Calling all_inventory to load vars for managed_node1 27844 1726882767.50753: Calling groups_inventory to load vars for managed_node1 27844 1726882767.50756: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.50769: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.50772: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.50775: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.52508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.54202: done with get_vars() 27844 1726882767.54226: done getting variables 27844 1726882767.54282: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that no warning is logged for nm provider] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:129 Friday 20 September 2024 21:39:27 -0400 (0:00:00.052) 0:00:26.619 ****** 27844 1726882767.54309: entering _queue_task() for managed_node1/assert 27844 1726882767.54569: worker is 1 (out of 1 available) 27844 1726882767.54580: exiting _queue_task() for managed_node1/assert 27844 1726882767.54591: done queuing things up, now waiting for results queue to drain 27844 1726882767.54593: waiting for pending results... 
27844 1726882767.54868: running TaskExecutor() for managed_node1/TASK: Assert that no warning is logged for nm provider 27844 1726882767.54947: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b2 27844 1726882767.54960: variable 'ansible_search_path' from source: unknown 27844 1726882767.54998: calling self._execute() 27844 1726882767.55099: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.55102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.55110: variable 'omit' from source: magic vars 27844 1726882767.55498: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.55509: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882767.55626: variable 'network_provider' from source: set_fact 27844 1726882767.55636: Evaluated conditional (network_provider == "nm"): True 27844 1726882767.55643: variable 'omit' from source: magic vars 27844 1726882767.55665: variable 'omit' from source: magic vars 27844 1726882767.55706: variable 'omit' from source: magic vars 27844 1726882767.55746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882767.55781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882767.55802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882767.55818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882767.55829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882767.55859: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882767.55862: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 
1726882767.55869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.55971: Set connection var ansible_shell_type to sh 27844 1726882767.55974: Set connection var ansible_connection to ssh 27844 1726882767.55980: Set connection var ansible_pipelining to False 27844 1726882767.55986: Set connection var ansible_timeout to 10 27844 1726882767.55992: Set connection var ansible_shell_executable to /bin/sh 27844 1726882767.55997: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882767.56027: variable 'ansible_shell_executable' from source: unknown 27844 1726882767.56031: variable 'ansible_connection' from source: unknown 27844 1726882767.56033: variable 'ansible_module_compression' from source: unknown 27844 1726882767.56036: variable 'ansible_shell_type' from source: unknown 27844 1726882767.56038: variable 'ansible_shell_executable' from source: unknown 27844 1726882767.56040: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.56042: variable 'ansible_pipelining' from source: unknown 27844 1726882767.56045: variable 'ansible_timeout' from source: unknown 27844 1726882767.56050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.56186: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882767.56196: variable 'omit' from source: magic vars 27844 1726882767.56201: starting attempt loop 27844 1726882767.56204: running the handler 27844 1726882767.56371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882767.56613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
27844 1726882767.56676: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882767.56708: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882767.56744: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882767.56845: variable '__network_connections_result' from source: set_fact 27844 1726882767.56870: Evaluated conditional (__network_connections_result.stderr is not search("")): True 27844 1726882767.56873: handler run complete 27844 1726882767.56893: attempt loop complete, returning result 27844 1726882767.56896: _execute() done 27844 1726882767.56900: dumping result to json 27844 1726882767.56903: done dumping result, returning 27844 1726882767.56905: done running TaskExecutor() for managed_node1/TASK: Assert that no warning is logged for nm provider [0e448fcc-3ce9-efa9-466a-0000000000b2] 27844 1726882767.56911: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b2 27844 1726882767.57008: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b2 27844 1726882767.57012: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882767.57059: no more pending results, returning what we have 27844 1726882767.57062: results queue empty 27844 1726882767.57065: checking for any_errors_fatal 27844 1726882767.57073: done checking for any_errors_fatal 27844 1726882767.57073: checking for max_fail_percentage 27844 1726882767.57076: done checking for max_fail_percentage 27844 1726882767.57077: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.57078: done checking to see if all hosts have failed 27844 1726882767.57078: getting the remaining hosts for this loop 27844 1726882767.57080: done getting the remaining hosts for this loop 27844 1726882767.57083: getting the next task for 
host managed_node1 27844 1726882767.57092: done getting next task for host managed_node1 27844 1726882767.57096: ^ task is: TASK: Bring down test devices and profiles 27844 1726882767.57098: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882767.57103: getting variables 27844 1726882767.57105: in VariableManager get_vars() 27844 1726882767.57144: Calling all_inventory to load vars for managed_node1 27844 1726882767.57147: Calling groups_inventory to load vars for managed_node1 27844 1726882767.57149: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.57160: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.57165: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.57168: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.58697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.60554: done with get_vars() 27844 1726882767.60576: done getting variables TASK [Bring down test devices and profiles] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:140 Friday 20 September 2024 21:39:27 -0400 (0:00:00.063) 0:00:26.683 ****** 27844 1726882767.60676: entering _queue_task() for managed_node1/include_role 27844 1726882767.60678: Creating lock for include_role 
27844 1726882767.60940: worker is 1 (out of 1 available) 27844 1726882767.60951: exiting _queue_task() for managed_node1/include_role 27844 1726882767.60962: done queuing things up, now waiting for results queue to drain 27844 1726882767.60965: waiting for pending results... 27844 1726882767.61252: running TaskExecutor() for managed_node1/TASK: Bring down test devices and profiles 27844 1726882767.61357: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b4 27844 1726882767.61372: variable 'ansible_search_path' from source: unknown 27844 1726882767.61414: calling self._execute() 27844 1726882767.61510: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.61520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.61531: variable 'omit' from source: magic vars 27844 1726882767.61918: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.61934: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882767.61940: _execute() done 27844 1726882767.61944: dumping result to json 27844 1726882767.61946: done dumping result, returning 27844 1726882767.61958: done running TaskExecutor() for managed_node1/TASK: Bring down test devices and profiles [0e448fcc-3ce9-efa9-466a-0000000000b4] 27844 1726882767.61964: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b4 27844 1726882767.62107: no more pending results, returning what we have 27844 1726882767.62112: in VariableManager get_vars() 27844 1726882767.62156: Calling all_inventory to load vars for managed_node1 27844 1726882767.62158: Calling groups_inventory to load vars for managed_node1 27844 1726882767.62161: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.62178: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.62181: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.62185: Calling groups_plugins_play 
to load vars for managed_node1 27844 1726882767.62705: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b4 27844 1726882767.62708: WORKER PROCESS EXITING 27844 1726882767.63702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.65423: done with get_vars() 27844 1726882767.65441: variable 'ansible_search_path' from source: unknown 27844 1726882767.65713: variable 'omit' from source: magic vars 27844 1726882767.65746: variable 'omit' from source: magic vars 27844 1726882767.65760: variable 'omit' from source: magic vars 27844 1726882767.65766: we have included files to process 27844 1726882767.65767: generating all_blocks data 27844 1726882767.65769: done generating all_blocks data 27844 1726882767.65772: processing included file: fedora.linux_system_roles.network 27844 1726882767.65795: in VariableManager get_vars() 27844 1726882767.65813: done with get_vars() 27844 1726882767.65846: in VariableManager get_vars() 27844 1726882767.65868: done with get_vars() 27844 1726882767.65912: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27844 1726882767.66044: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27844 1726882767.66126: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27844 1726882767.66633: in VariableManager get_vars() 27844 1726882767.66657: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27844 1726882767.68719: iterating over new_blocks loaded from include file 27844 1726882767.68721: in VariableManager get_vars() 27844 1726882767.68739: done with get_vars() 27844 1726882767.68741: filtering new block on tags 27844 1726882767.68995: done filtering new block on tags 27844 1726882767.68998: in VariableManager get_vars() 27844 
1726882767.69016: done with get_vars() 27844 1726882767.69017: filtering new block on tags 27844 1726882767.69034: done filtering new block on tags 27844 1726882767.69036: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node1 27844 1726882767.69041: extending task lists for all hosts with included blocks 27844 1726882767.69264: done extending task lists 27844 1726882767.69266: done processing included files 27844 1726882767.69267: results queue empty 27844 1726882767.69267: checking for any_errors_fatal 27844 1726882767.69271: done checking for any_errors_fatal 27844 1726882767.69271: checking for max_fail_percentage 27844 1726882767.69272: done checking for max_fail_percentage 27844 1726882767.69273: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.69274: done checking to see if all hosts have failed 27844 1726882767.69275: getting the remaining hosts for this loop 27844 1726882767.69276: done getting the remaining hosts for this loop 27844 1726882767.69278: getting the next task for host managed_node1 27844 1726882767.69282: done getting next task for host managed_node1 27844 1726882767.69285: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27844 1726882767.69288: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882767.69297: getting variables 27844 1726882767.69298: in VariableManager get_vars() 27844 1726882767.69318: Calling all_inventory to load vars for managed_node1 27844 1726882767.69320: Calling groups_inventory to load vars for managed_node1 27844 1726882767.69322: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.69328: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.69330: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.69333: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.70553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.72247: done with get_vars() 27844 1726882767.72271: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:27 -0400 (0:00:00.116) 0:00:26.800 ****** 27844 1726882767.72352: entering _queue_task() for managed_node1/include_tasks 27844 1726882767.72654: worker is 1 (out of 1 available) 27844 1726882767.72670: exiting _queue_task() for managed_node1/include_tasks 27844 1726882767.72684: done queuing things up, now waiting for results queue to drain 27844 1726882767.72685: waiting for pending results... 
27844 1726882767.72977: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27844 1726882767.73088: in run() - task 0e448fcc-3ce9-efa9-466a-000000000641 27844 1726882767.73100: variable 'ansible_search_path' from source: unknown 27844 1726882767.73103: variable 'ansible_search_path' from source: unknown 27844 1726882767.73144: calling self._execute() 27844 1726882767.73237: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.73241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.73251: variable 'omit' from source: magic vars 27844 1726882767.73628: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.73640: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882767.73650: _execute() done 27844 1726882767.73654: dumping result to json 27844 1726882767.73657: done dumping result, returning 27844 1726882767.73671: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-efa9-466a-000000000641] 27844 1726882767.73675: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000641 27844 1726882767.73761: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000641 27844 1726882767.73768: WORKER PROCESS EXITING 27844 1726882767.73813: no more pending results, returning what we have 27844 1726882767.73818: in VariableManager get_vars() 27844 1726882767.73865: Calling all_inventory to load vars for managed_node1 27844 1726882767.73868: Calling groups_inventory to load vars for managed_node1 27844 1726882767.73873: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.73887: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.73890: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.73894: Calling 
groups_plugins_play to load vars for managed_node1 27844 1726882767.75600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.77341: done with get_vars() 27844 1726882767.77361: variable 'ansible_search_path' from source: unknown 27844 1726882767.77362: variable 'ansible_search_path' from source: unknown 27844 1726882767.77403: we have included files to process 27844 1726882767.77405: generating all_blocks data 27844 1726882767.77406: done generating all_blocks data 27844 1726882767.77409: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27844 1726882767.77410: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27844 1726882767.77412: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27844 1726882767.78000: done processing included file 27844 1726882767.78002: iterating over new_blocks loaded from include file 27844 1726882767.78004: in VariableManager get_vars() 27844 1726882767.78029: done with get_vars() 27844 1726882767.78031: filtering new block on tags 27844 1726882767.78057: done filtering new block on tags 27844 1726882767.78060: in VariableManager get_vars() 27844 1726882767.78089: done with get_vars() 27844 1726882767.78090: filtering new block on tags 27844 1726882767.78128: done filtering new block on tags 27844 1726882767.78131: in VariableManager get_vars() 27844 1726882767.78154: done with get_vars() 27844 1726882767.78156: filtering new block on tags 27844 1726882767.78197: done filtering new block on tags 27844 1726882767.78199: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 27844 1726882767.78204: extending task lists for 
all hosts with included blocks 27844 1726882767.79248: done extending task lists 27844 1726882767.79249: done processing included files 27844 1726882767.79250: results queue empty 27844 1726882767.79251: checking for any_errors_fatal 27844 1726882767.79254: done checking for any_errors_fatal 27844 1726882767.79254: checking for max_fail_percentage 27844 1726882767.79255: done checking for max_fail_percentage 27844 1726882767.79261: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.79262: done checking to see if all hosts have failed 27844 1726882767.79262: getting the remaining hosts for this loop 27844 1726882767.79264: done getting the remaining hosts for this loop 27844 1726882767.79266: getting the next task for host managed_node1 27844 1726882767.79272: done getting next task for host managed_node1 27844 1726882767.79274: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27844 1726882767.79278: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882767.79287: getting variables 27844 1726882767.79288: in VariableManager get_vars() 27844 1726882767.79303: Calling all_inventory to load vars for managed_node1 27844 1726882767.79305: Calling groups_inventory to load vars for managed_node1 27844 1726882767.79307: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.79312: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.79314: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.79317: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.80605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.82597: done with get_vars() 27844 1726882767.82615: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:27 -0400 (0:00:00.103) 0:00:26.903 ****** 27844 1726882767.82692: entering _queue_task() for managed_node1/setup 27844 1726882767.83014: worker is 1 (out of 1 available) 27844 1726882767.83026: exiting _queue_task() for managed_node1/setup 27844 1726882767.83039: done queuing things up, now waiting for results queue to drain 27844 1726882767.83041: waiting for pending results... 
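The "HOST STATE:" dumps above are Ansible's per-host play-iterator state serialized into the debug stream. As an aid to reading the nested `tasks child state? (...)` fragments, here is a hypothetical Python mirror of the fields the log prints (field names taken verbatim from the dump; the real structure is Ansible's internal `HostState` in `ansible/executor/play_iterator.py`, which this sketch only approximates):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical mirror of the fields shown in the "HOST STATE:" dumps above.
# This is a reading aid, not Ansible's actual class definition.
@dataclass
class HostState:
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    handlers: int = 0
    run_state: int = 0
    fail_state: int = 0
    pre_flushing_run_state: Optional[int] = None
    update_handlers: bool = True
    pending_setup: bool = False
    # The "(HOST STATE: ...)" fragments nested inside the dump map to these
    # recursive child-state fields:
    tasks_child_state: Optional["HostState"] = None
    rescue_child_state: Optional["HostState"] = None
    always_child_state: Optional["HostState"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

# Outer state from the dump above: block=3, task=11, always=1, run_state=3,
# with a child state nested under "always child state?".
outer = HostState(block=3, task=11, always=1, run_state=3,
                  pre_flushing_run_state=1,
                  always_child_state=HostState(block=0, task=2, run_state=1))
print(outer.always_child_state.task)
```

Each additional level of parentheses in the log is simply one more level of this recursion.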
27844 1726882767.83353: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27844 1726882767.83547: in run() - task 0e448fcc-3ce9-efa9-466a-0000000006a7 27844 1726882767.83559: variable 'ansible_search_path' from source: unknown 27844 1726882767.83563: variable 'ansible_search_path' from source: unknown 27844 1726882767.83599: calling self._execute() 27844 1726882767.83695: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.83700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.83709: variable 'omit' from source: magic vars 27844 1726882767.84087: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.84099: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882767.84454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882767.87085: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882767.87216: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882767.87252: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882767.87287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882767.87313: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882767.87395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882767.87423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882767.87457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882767.87497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882767.87510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882767.87561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882767.87583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882767.87605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882767.87644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882767.87661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882767.87822: variable '__network_required_facts' from source: role 
'' defaults 27844 1726882767.87830: variable 'ansible_facts' from source: unknown 27844 1726882767.88667: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27844 1726882767.88673: when evaluation is False, skipping this task 27844 1726882767.88676: _execute() done 27844 1726882767.88678: dumping result to json 27844 1726882767.88681: done dumping result, returning 27844 1726882767.88683: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-efa9-466a-0000000006a7] 27844 1726882767.88689: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006a7 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882767.88827: no more pending results, returning what we have 27844 1726882767.88832: results queue empty 27844 1726882767.88835: checking for any_errors_fatal 27844 1726882767.88837: done checking for any_errors_fatal 27844 1726882767.88838: checking for max_fail_percentage 27844 1726882767.88840: done checking for max_fail_percentage 27844 1726882767.88841: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.88841: done checking to see if all hosts have failed 27844 1726882767.88842: getting the remaining hosts for this loop 27844 1726882767.88845: done getting the remaining hosts for this loop 27844 1726882767.88848: getting the next task for host managed_node1 27844 1726882767.88859: done getting next task for host managed_node1 27844 1726882767.88865: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27844 1726882767.88871: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882767.88885: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006a7 27844 1726882767.88903: getting variables 27844 1726882767.88906: in VariableManager get_vars() 27844 1726882767.88950: Calling all_inventory to load vars for managed_node1 27844 1726882767.88954: Calling groups_inventory to load vars for managed_node1 27844 1726882767.88957: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.88971: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.88975: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.88979: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.89545: WORKER PROCESS EXITING 27844 1726882767.90708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.92453: done with get_vars() 27844 1726882767.92480: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:27 -0400 (0:00:00.098) 0:00:27.002 ****** 27844 1726882767.92584: entering _queue_task() for managed_node1/stat 27844 1726882767.92852: worker is 1 (out of 1 available) 27844 1726882767.92865: exiting _queue_task() for managed_node1/stat 27844 1726882767.92880: done queuing things up, now waiting for results queue to drain 27844 1726882767.92882: waiting for pending results... 27844 1726882767.93178: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 27844 1726882767.93304: in run() - task 0e448fcc-3ce9-efa9-466a-0000000006a9 27844 1726882767.93316: variable 'ansible_search_path' from source: unknown 27844 1726882767.93320: variable 'ansible_search_path' from source: unknown 27844 1726882767.93368: calling self._execute() 27844 1726882767.93462: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.93470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.93479: variable 'omit' from source: magic vars 27844 1726882767.93850: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.93862: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882767.94043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882767.94677: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882767.94681: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882767.94684: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882767.94686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
27844 1726882767.94689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882767.94692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882767.94695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882767.94698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882767.94700: variable '__network_is_ostree' from source: set_fact 27844 1726882767.94702: Evaluated conditional (not __network_is_ostree is defined): False 27844 1726882767.94705: when evaluation is False, skipping this task 27844 1726882767.94707: _execute() done 27844 1726882767.94709: dumping result to json 27844 1726882767.94712: done dumping result, returning 27844 1726882767.94714: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-efa9-466a-0000000006a9] 27844 1726882767.94716: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006a9 27844 1726882767.94810: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006a9 27844 1726882767.94813: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27844 1726882767.94899: no more pending results, returning what we have 27844 1726882767.94904: results queue empty 27844 1726882767.94905: checking for any_errors_fatal 27844 1726882767.94912: 
done checking for any_errors_fatal 27844 1726882767.94912: checking for max_fail_percentage 27844 1726882767.94914: done checking for max_fail_percentage 27844 1726882767.94915: checking to see if all hosts have failed and the running result is not ok 27844 1726882767.94916: done checking to see if all hosts have failed 27844 1726882767.94917: getting the remaining hosts for this loop 27844 1726882767.94919: done getting the remaining hosts for this loop 27844 1726882767.94922: getting the next task for host managed_node1 27844 1726882767.94928: done getting next task for host managed_node1 27844 1726882767.94932: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27844 1726882767.94937: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882767.94957: getting variables 27844 1726882767.94959: in VariableManager get_vars() 27844 1726882767.95002: Calling all_inventory to load vars for managed_node1 27844 1726882767.95005: Calling groups_inventory to load vars for managed_node1 27844 1726882767.95008: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882767.95018: Calling all_plugins_play to load vars for managed_node1 27844 1726882767.95021: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882767.95024: Calling groups_plugins_play to load vars for managed_node1 27844 1726882767.96733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882767.98462: done with get_vars() 27844 1726882767.98482: done getting variables 27844 1726882767.98546: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:27 -0400 (0:00:00.059) 0:00:27.062 ****** 27844 1726882767.98584: entering _queue_task() for managed_node1/set_fact 27844 1726882767.98870: worker is 1 (out of 1 available) 27844 1726882767.98881: exiting _queue_task() for managed_node1/set_fact 27844 1726882767.98895: done queuing things up, now waiting for results queue to drain 27844 1726882767.98897: waiting for pending results... 
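Both skips above come from `when:` guards evaluating to False: `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` (all required facts are already present) and `not __network_is_ostree is defined` (the flag was already set by an earlier `set_fact`). A minimal Python sketch of the first guard's set-difference logic, with hypothetical fact names and values (the real `__network_required_facts` list lives in the role's defaults and is not shown in this log):

```python
# Hypothetical stand-ins; the actual required-fact names come from the
# role defaults and the actual facts from the managed host.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Jinja's difference() filter on the fact keys, roughly:
missing = [f for f in required_facts if f not in ansible_facts]

# The guard is "length of the difference > 0"; here nothing is missing,
# so the condition is False and the setup task is skipped, as in the log.
run_setup = len(missing) > 0
print(run_setup)
```

When the condition is False, Ansible emits exactly the `skipping: [managed_node1]` result seen above rather than dispatching the module to the host.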
27844 1726882767.99202: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27844 1726882767.99335: in run() - task 0e448fcc-3ce9-efa9-466a-0000000006aa 27844 1726882767.99353: variable 'ansible_search_path' from source: unknown 27844 1726882767.99357: variable 'ansible_search_path' from source: unknown 27844 1726882767.99393: calling self._execute() 27844 1726882767.99492: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882767.99495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882767.99511: variable 'omit' from source: magic vars 27844 1726882767.99879: variable 'ansible_distribution_major_version' from source: facts 27844 1726882767.99896: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882768.00071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882768.00338: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882768.00385: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882768.00418: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882768.00453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882768.00534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882768.00569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882768.00593: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882768.00624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882768.00716: variable '__network_is_ostree' from source: set_fact 27844 1726882768.00722: Evaluated conditional (not __network_is_ostree is defined): False 27844 1726882768.00725: when evaluation is False, skipping this task 27844 1726882768.00729: _execute() done 27844 1726882768.00732: dumping result to json 27844 1726882768.00734: done dumping result, returning 27844 1726882768.00737: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-efa9-466a-0000000006aa] 27844 1726882768.00743: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006aa 27844 1726882768.00828: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006aa 27844 1726882768.00831: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27844 1726882768.00911: no more pending results, returning what we have 27844 1726882768.00917: results queue empty 27844 1726882768.00920: checking for any_errors_fatal 27844 1726882768.00929: done checking for any_errors_fatal 27844 1726882768.00929: checking for max_fail_percentage 27844 1726882768.00931: done checking for max_fail_percentage 27844 1726882768.00932: checking to see if all hosts have failed and the running result is not ok 27844 1726882768.00933: done checking to see if all hosts have failed 27844 1726882768.00934: getting the remaining hosts for this loop 27844 1726882768.00936: done getting the remaining hosts for this loop 
27844 1726882768.00939: getting the next task for host managed_node1 27844 1726882768.00949: done getting next task for host managed_node1 27844 1726882768.00953: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27844 1726882768.00958: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882768.00979: getting variables 27844 1726882768.00982: in VariableManager get_vars() 27844 1726882768.01023: Calling all_inventory to load vars for managed_node1 27844 1726882768.01026: Calling groups_inventory to load vars for managed_node1 27844 1726882768.01029: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882768.01039: Calling all_plugins_play to load vars for managed_node1 27844 1726882768.01042: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882768.01045: Calling groups_plugins_play to load vars for managed_node1 27844 1726882768.02602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882768.04311: done with get_vars() 27844 1726882768.04330: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:28 -0400 (0:00:00.058) 0:00:27.120 ****** 27844 1726882768.04428: entering _queue_task() for managed_node1/service_facts 27844 1726882768.04671: worker is 1 (out of 1 available) 27844 1726882768.04684: exiting _queue_task() for managed_node1/service_facts 27844 1726882768.04696: done queuing things up, now waiting for results queue to drain 27844 1726882768.04697: waiting for pending results... 
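The `service_facts` run that follows begins with `_low_level_execute_command()` creating a remote working directory such as `ansible-tmp-1726882768.103668-29108-180061790744313`. Inferring from the log, the name appears to combine a timestamp, a worker PID, and a random integer; the sketch below reconstructs that naming scheme under that assumption (the function name and parameters are illustrative, not Ansible's API):

```python
import os
import random
import time

def tmp_dir_name(now=None, pid=None, rand=None):
    """Hypothetical reconstruction of the remote tmp-dir naming seen in the
    log: "ansible-tmp-<epoch seconds>-<pid>-<random int>". Inferred from the
    log output, not taken from Ansible's source."""
    now = time.time() if now is None else now
    pid = os.getpid() if pid is None else pid
    rand = random.randint(0, 2 ** 48) if rand is None else rand
    return f"ansible-tmp-{now}-{pid}-{rand}"

# Plugging in the values visible in the log reproduces the directory name
# that the mkdir command above creates under /root/.ansible/tmp:
print(tmp_dir_name(now=1726882768.103668, pid=29108, rand=180061790744313))
```

The surrounding `umask 77 && mkdir -p ... && echo ...` shell pipeline in the log then creates that directory with restrictive permissions and echoes the resolved path back so the controller learns where to upload the module payload.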
27844 1726882768.05130: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 27844 1726882768.05247: in run() - task 0e448fcc-3ce9-efa9-466a-0000000006ac 27844 1726882768.05270: variable 'ansible_search_path' from source: unknown 27844 1726882768.05274: variable 'ansible_search_path' from source: unknown 27844 1726882768.05309: calling self._execute() 27844 1726882768.05517: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882768.05520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882768.05526: variable 'omit' from source: magic vars 27844 1726882768.06303: variable 'ansible_distribution_major_version' from source: facts 27844 1726882768.06315: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882768.06321: variable 'omit' from source: magic vars 27844 1726882768.06516: variable 'omit' from source: magic vars 27844 1726882768.06546: variable 'omit' from source: magic vars 27844 1726882768.06705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882768.06738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882768.06757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882768.06778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882768.06832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882768.06860: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882768.06868: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882768.06871: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 27844 1726882768.06996: Set connection var ansible_shell_type to sh 27844 1726882768.07000: Set connection var ansible_connection to ssh 27844 1726882768.07009: Set connection var ansible_pipelining to False 27844 1726882768.07016: Set connection var ansible_timeout to 10 27844 1726882768.07024: Set connection var ansible_shell_executable to /bin/sh 27844 1726882768.07027: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882768.07058: variable 'ansible_shell_executable' from source: unknown 27844 1726882768.07062: variable 'ansible_connection' from source: unknown 27844 1726882768.07069: variable 'ansible_module_compression' from source: unknown 27844 1726882768.07072: variable 'ansible_shell_type' from source: unknown 27844 1726882768.07076: variable 'ansible_shell_executable' from source: unknown 27844 1726882768.07078: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882768.07080: variable 'ansible_pipelining' from source: unknown 27844 1726882768.07083: variable 'ansible_timeout' from source: unknown 27844 1726882768.07085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882768.07294: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882768.07305: variable 'omit' from source: magic vars 27844 1726882768.07308: starting attempt loop 27844 1726882768.07311: running the handler 27844 1726882768.07325: _low_level_execute_command(): starting 27844 1726882768.07332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882768.08061: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882768.08076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 27844 1726882768.08088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.08107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882768.08149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882768.08157: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882768.08171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.08183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882768.08193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882768.08199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882768.08212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882768.08222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.08240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882768.08248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882768.08256: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882768.08271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.08344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882768.08369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882768.08380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882768.08506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 
1726882768.10177: stdout chunk (state=3): >>>/root <<< 27844 1726882768.10333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882768.10338: stdout chunk (state=3): >>><<< 27844 1726882768.10347: stderr chunk (state=3): >>><<< 27844 1726882768.10368: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882768.10381: _low_level_execute_command(): starting 27844 1726882768.10386: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313 `" && echo ansible-tmp-1726882768.103668-29108-180061790744313="` echo /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313 `" ) && sleep 0' 27844 1726882768.12093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 27844 1726882768.12098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.12121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882768.12235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882768.12252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.12288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882768.12293: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882768.12320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.12334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882768.12337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.12458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882768.12489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882768.12505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882768.12645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882768.14492: stdout chunk (state=3): >>>ansible-tmp-1726882768.103668-29108-180061790744313=/root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313 <<< 27844 1726882768.14681: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882768.14684: stdout chunk (state=3): >>><<< 27844 1726882768.14687: stderr chunk (state=3): >>><<< 27844 1726882768.14772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882768.103668-29108-180061790744313=/root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882768.14776: variable 'ansible_module_compression' from source: unknown 27844 1726882768.14871: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 27844 1726882768.14874: variable 'ansible_facts' from source: unknown 27844 1726882768.14946: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313/AnsiballZ_service_facts.py 27844 1726882768.15560: 
Sending initial data 27844 1726882768.15569: Sent initial data (161 bytes) 27844 1726882768.16517: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882768.16521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882768.16558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.16561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.16568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.16633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882768.16646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882768.16771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882768.18528: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882768.18619: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882768.18714: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmppgdq_e2x /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313/AnsiballZ_service_facts.py <<< 27844 1726882768.18807: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882768.20285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882768.20476: stderr chunk (state=3): >>><<< 27844 1726882768.20479: stdout chunk (state=3): >>><<< 27844 1726882768.20482: done transferring module to remote 27844 1726882768.20484: _low_level_execute_command(): starting 27844 1726882768.20487: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313/ /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313/AnsiballZ_service_facts.py && sleep 0' 27844 1726882768.21083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882768.21098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882768.21113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.21138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882768.21183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882768.21195: stderr chunk (state=3): >>>debug2: match not 
found <<< 27844 1726882768.21214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.21238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882768.21251: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882768.21262: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882768.21282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882768.21297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.21313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882768.21327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882768.21341: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882768.21356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.21426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882768.21430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882768.21545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882768.23301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882768.23384: stderr chunk (state=3): >>><<< 27844 1726882768.23387: stdout chunk (state=3): >>><<< 27844 1726882768.23399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882768.23402: _low_level_execute_command(): starting 27844 1726882768.23405: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313/AnsiballZ_service_facts.py && sleep 0' 27844 1726882768.23930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882768.23934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882768.23945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.23959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882768.24008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882768.24248: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.24254: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882768.24283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882768.24288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882768.24352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882768.24356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882768.24374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882768.24514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882769.55771: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.serv<<< 27844 1726882769.55813: stdout chunk (state=3): >>>ice", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": 
"systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 27844 1726882769.55821: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", 
"status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alia<<< 27844 1726882769.55824: stdout chunk (state=3): >>>s", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper<<< 27844 1726882769.55828: stdout chunk (state=3): >>>-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", 
"status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "<<< 27844 1726882769.55831: stdout chunk (state=3): 
>>>source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27844 1726882769.57080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882769.57171: stderr chunk (state=3): >>><<< 27844 1726882769.57175: stdout chunk (state=3): >>><<< 27844 1726882769.57476: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
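The reassembled stdout above is the `service_facts` module result: a single JSON document whose `ansible_facts.services` map keys each unit name to a `state` (`running`, `stopped`, `inactive`, `unknown`) and a `status` (`enabled`, `disabled`, `static`, `alias`, `indirect`, `not-found`). A minimal sketch of consuming that payload outside Ansible, using two entries copied from the log (the tiny inline JSON here is an illustrative excerpt, not the full result):

```python
import json

# Illustrative excerpt of the service_facts payload captured in the log above;
# the real result contains the full services map, this keeps only two entries.
payload = json.loads("""
{"ansible_facts": {"services": {
  "sshd.service": {"name": "sshd.service", "state": "running",
                   "status": "enabled", "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                        "status": "disabled", "source": "systemd"}
}}, "invocation": {"module_args": {}}}
""")

# Filter on "state" to find units that are actually running right now,
# as opposed to "status", which reflects the unit-file enablement.
running = [svc["name"]
           for svc in payload["ansible_facts"]["services"].values()
           if svc["state"] == "running"]
print(running)  # -> ['sshd.service']
```

Inside a playbook the same data is reachable as `ansible_facts.services['sshd.service'].state` after a `service_facts` task; the distinction between `state` and `status` shown here is why a unit can be `stopped` yet `enabled` (e.g. `cloud-init.service` in the log).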
27844 1726882769.58167: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882769.58184: _low_level_execute_command(): starting 27844 1726882769.58194: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882768.103668-29108-180061790744313/ > /dev/null 2>&1 && sleep 0' 27844 1726882769.58806: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882769.58819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.58831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.58845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.58887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.58898: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882769.58911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.58925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882769.58935: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 27844 1726882769.58943: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882769.58953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.58965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.58979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.58989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.58997: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882769.59008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.59084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882769.59107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882769.59122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882769.59242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882769.61049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882769.61151: stderr chunk (state=3): >>><<< 27844 1726882769.61161: stdout chunk (state=3): >>><<< 27844 1726882769.61275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882769.61278: handler run complete 27844 1726882769.61474: variable 'ansible_facts' from source: unknown 27844 1726882769.61570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882769.62094: variable 'ansible_facts' from source: unknown 27844 1726882769.62250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882769.62499: attempt loop complete, returning result 27844 1726882769.62509: _execute() done 27844 1726882769.62516: dumping result to json 27844 1726882769.62595: done dumping result, returning 27844 1726882769.62609: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-efa9-466a-0000000006ac] 27844 1726882769.62619: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006ac ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882769.63678: no more pending results, returning what we have 27844 1726882769.63681: results queue empty 27844 1726882769.63683: checking for any_errors_fatal 27844 1726882769.63686: done checking for any_errors_fatal 27844 1726882769.63687: checking for max_fail_percentage 27844 1726882769.63689: 
done checking for max_fail_percentage 27844 1726882769.63689: checking to see if all hosts have failed and the running result is not ok 27844 1726882769.63690: done checking to see if all hosts have failed 27844 1726882769.63691: getting the remaining hosts for this loop 27844 1726882769.63693: done getting the remaining hosts for this loop 27844 1726882769.63698: getting the next task for host managed_node1 27844 1726882769.63705: done getting next task for host managed_node1 27844 1726882769.63709: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27844 1726882769.63715: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882769.63725: getting variables 27844 1726882769.63727: in VariableManager get_vars() 27844 1726882769.63770: Calling all_inventory to load vars for managed_node1 27844 1726882769.63774: Calling groups_inventory to load vars for managed_node1 27844 1726882769.63777: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882769.63793: Calling all_plugins_play to load vars for managed_node1 27844 1726882769.63796: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882769.63799: Calling groups_plugins_play to load vars for managed_node1 27844 1726882769.64927: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006ac 27844 1726882769.64930: WORKER PROCESS EXITING 27844 1726882769.65751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882769.67784: done with get_vars() 27844 1726882769.67802: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:29 -0400 (0:00:01.634) 0:00:28.755 ****** 27844 1726882769.67877: entering _queue_task() for managed_node1/package_facts 27844 1726882769.68111: worker is 1 (out of 1 available) 27844 1726882769.68124: exiting _queue_task() for managed_node1/package_facts 27844 1726882769.68137: done queuing things up, now waiting for results queue to drain 27844 1726882769.68138: waiting for pending results... 
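The `package_facts` task queued here returns a dict mapping each package name to a *list* of install records (one per installed arch/version), as visible in the module stdout later in the log. A minimal sketch of consuming that structure, with hypothetical sample data in the same shape:

```python
# Hedged sketch: querying a package_facts-style dict
# (package name -> list of {"name", "version", "release", "epoch",
#  "arch", "source"} records, as in the module output in this log).
packages = {
    "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9",
               "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9",
              "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

def pkg_version_release(facts, name):
    """Return 'version-release' of the first install of `name`, or None."""
    installs = facts.get(name)
    if not installs:
        return None
    first = installs[0]
    return "{}-{}".format(first["version"], first["release"])

print(pkg_version_release(packages, "bash"))   # -> 5.1.8-9.el9
print(pkg_version_release(packages, "nginx"))  # -> None
```

Because every value is a list, checks like `"bash" in ansible_facts.packages` are the idiomatic way to gate later tasks on a package being present, without assuming a single install record.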
27844 1726882769.68343: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 27844 1726882769.68446: in run() - task 0e448fcc-3ce9-efa9-466a-0000000006ad 27844 1726882769.68461: variable 'ansible_search_path' from source: unknown 27844 1726882769.68464: variable 'ansible_search_path' from source: unknown 27844 1726882769.68495: calling self._execute() 27844 1726882769.68566: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882769.68572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882769.68586: variable 'omit' from source: magic vars 27844 1726882769.68851: variable 'ansible_distribution_major_version' from source: facts 27844 1726882769.68862: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882769.68872: variable 'omit' from source: magic vars 27844 1726882769.68919: variable 'omit' from source: magic vars 27844 1726882769.68940: variable 'omit' from source: magic vars 27844 1726882769.68978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882769.69004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882769.69021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882769.69034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882769.69044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882769.69067: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882769.69074: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882769.69077: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 27844 1726882769.69145: Set connection var ansible_shell_type to sh 27844 1726882769.69148: Set connection var ansible_connection to ssh 27844 1726882769.69153: Set connection var ansible_pipelining to False 27844 1726882769.69159: Set connection var ansible_timeout to 10 27844 1726882769.69165: Set connection var ansible_shell_executable to /bin/sh 27844 1726882769.69173: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882769.69192: variable 'ansible_shell_executable' from source: unknown 27844 1726882769.69196: variable 'ansible_connection' from source: unknown 27844 1726882769.69199: variable 'ansible_module_compression' from source: unknown 27844 1726882769.69204: variable 'ansible_shell_type' from source: unknown 27844 1726882769.69539: variable 'ansible_shell_executable' from source: unknown 27844 1726882769.69542: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882769.69544: variable 'ansible_pipelining' from source: unknown 27844 1726882769.69547: variable 'ansible_timeout' from source: unknown 27844 1726882769.69549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882769.69552: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882769.69555: variable 'omit' from source: magic vars 27844 1726882769.69668: starting attempt loop 27844 1726882769.69671: running the handler 27844 1726882769.69685: _low_level_execute_command(): starting 27844 1726882769.69693: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882769.71091: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882769.71107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.71131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.71135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.71181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882769.71188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882769.71198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882769.71306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882769.73369: stdout chunk (state=3): >>>/root <<< 27844 1726882769.74121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882769.74196: stderr chunk (state=3): >>><<< 27844 1726882769.74224: stdout chunk (state=3): >>><<< 27844 1726882769.74355: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882769.74358: _low_level_execute_command(): starting 27844 1726882769.74361: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050 `" && echo ansible-tmp-1726882769.7425897-29164-198628690403050="` echo /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050 `" ) && sleep 0' 27844 1726882769.75015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.75019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.75055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.75078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.75139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882769.75154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882769.75283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882769.77991: stdout chunk (state=3): >>>ansible-tmp-1726882769.7425897-29164-198628690403050=/root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050 <<< 27844 1726882769.78012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882769.78108: stderr chunk (state=3): >>><<< 27844 1726882769.78122: stdout chunk (state=3): >>><<< 27844 1726882769.78774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882769.7425897-29164-198628690403050=/root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882769.78778: variable 'ansible_module_compression' from source: unknown 27844 1726882769.78780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 27844 1726882769.78782: variable 'ansible_facts' from source: unknown 27844 1726882769.79521: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050/AnsiballZ_package_facts.py 27844 1726882769.79700: Sending initial data 27844 1726882769.79703: Sent initial data (162 bytes) 27844 1726882769.80769: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882769.80784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.80797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.80819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.80860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.80878: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882769.80891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.80908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882769.80918: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882769.80935: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882769.80946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.80958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.80975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.80987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.80997: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882769.81009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.81121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882769.81139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882769.81158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882769.81381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882769.83143: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882769.83232: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 
debug1: Server handle limit 1019; using 64 <<< 27844 1726882769.83332: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmph9keliyi /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050/AnsiballZ_package_facts.py <<< 27844 1726882769.83417: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882769.86372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882769.86471: stderr chunk (state=3): >>><<< 27844 1726882769.86475: stdout chunk (state=3): >>><<< 27844 1726882769.86593: done transferring module to remote 27844 1726882769.86596: _low_level_execute_command(): starting 27844 1726882769.86599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050/ /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050/AnsiballZ_package_facts.py && sleep 0' 27844 1726882769.87206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882769.87222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.87239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.87269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.87311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.87324: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882769.87339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.87373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882769.87389: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 
27844 1726882769.87407: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882769.87421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.87435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.87451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.87462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.87485: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882769.87502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.87586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882769.87606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882769.87621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882769.87743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882769.89525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882769.89570: stderr chunk (state=3): >>><<< 27844 1726882769.89573: stdout chunk (state=3): >>><<< 27844 1726882769.89582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882769.89585: _low_level_execute_command(): starting 27844 1726882769.89591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050/AnsiballZ_package_facts.py && sleep 0' 27844 1726882769.90137: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882769.90152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.90170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.90190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.90229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.90241: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882769.90254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.90274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882769.90288: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882769.90300: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 
1726882769.90311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882769.90327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882769.90342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882769.90353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882769.90363: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882769.90379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882769.90451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882769.90473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882769.90490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882769.90617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882770.36698: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 27844 1726882770.36715: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": 
"xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", 
"release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": 
"2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 27844 1726882770.36721: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 27844 1726882770.36724: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": 
"xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": 
"ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 27844 1726882770.36734: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", 
"release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": 
"19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 27844 1726882770.36740: 
stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": 
"curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 27844 1726882770.36817: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": 
"20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 27844 1726882770.36834: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 27844 1726882770.36839: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", 
"release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 27844 1726882770.36877: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 27844 1726882770.36900: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27844 1726882770.38472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882770.38475: stdout chunk (state=3): >>><<< 27844 1726882770.38477: stderr chunk (state=3): >>><<< 27844 1726882770.38777: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
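For context, the module result returned above (and censored later because of `no_log`) comes from Ansible's built-in `package_facts` module, invoked with `manager: auto` and `strategy: first` as shown in the `invocation` block; its output populates `ansible_facts.packages` with the per-package records visible in the JSON. A minimal, hedged sketch of that pattern — the task names and the consumer task are illustrative, not taken from the role's actual source:

```yaml
# Hypothetical sketch, not the role's actual task file.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto          # matches the invocation shown in the log
    strategy: first
  no_log: true             # produces the "censored" result seen in the log

# Illustrative consumer: each entry in ansible_facts.packages is a list of
# dicts with name/version/release/epoch/arch/source, as in the JSON above.
- name: Show an installed package version (illustrative)
  ansible.builtin.debug:
    msg: "git version: {{ ansible_facts.packages['git'][0].version }}"
  when: "'git' in ansible_facts.packages"
```

The interleaved `debug1:`/`debug2:` OpenSSH lines around the module run come from connection multiplexing (`auto-mux: Trying existing master`): each module invocation reuses an already-open master connection instead of performing a fresh SSH handshake, which is why the stderr shows only a mux client session rather than key exchange.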
27844 1726882770.43827: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882770.43978: _low_level_execute_command(): starting 27844 1726882770.43988: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882769.7425897-29164-198628690403050/ > /dev/null 2>&1 && sleep 0' 27844 1726882770.45818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882770.45822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882770.45855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882770.45860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882770.45862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882770.46046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882770.46049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882770.46097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882770.46311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882770.48152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882770.48225: stderr chunk (state=3): >>><<< 27844 1726882770.48229: stdout chunk (state=3): >>><<< 27844 1726882770.48472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 27844 1726882770.48476: handler run complete 27844 1726882770.49691: variable 'ansible_facts' from source: unknown 27844 1726882770.50767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882770.55378: variable 'ansible_facts' from source: unknown 27844 1726882770.56537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882770.57959: attempt loop complete, returning result 27844 1726882770.57976: _execute() done 27844 1726882770.57979: dumping result to json 27844 1726882770.58215: done dumping result, returning 27844 1726882770.58225: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-efa9-466a-0000000006ad] 27844 1726882770.58229: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006ad ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882770.61550: no more pending results, returning what we have 27844 1726882770.61553: results queue empty 27844 1726882770.61554: checking for any_errors_fatal 27844 1726882770.61560: done checking for any_errors_fatal 27844 1726882770.61561: checking for max_fail_percentage 27844 1726882770.61563: done checking for max_fail_percentage 27844 1726882770.61565: checking to see if all hosts have failed and the running result is not ok 27844 1726882770.61566: done checking to see if all hosts have failed 27844 1726882770.61567: getting the remaining hosts for this loop 27844 1726882770.61568: done getting the remaining hosts for this loop 27844 1726882770.61571: getting the next task for host managed_node1 27844 1726882770.61578: done getting next task for host managed_node1 27844 1726882770.61582: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27844 
1726882770.61586: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882770.61598: getting variables 27844 1726882770.61599: in VariableManager get_vars() 27844 1726882770.61633: Calling all_inventory to load vars for managed_node1 27844 1726882770.61636: Calling groups_inventory to load vars for managed_node1 27844 1726882770.61638: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882770.61647: Calling all_plugins_play to load vars for managed_node1 27844 1726882770.61650: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882770.61652: Calling groups_plugins_play to load vars for managed_node1 27844 1726882770.62763: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000006ad 27844 1726882770.62769: WORKER PROCESS EXITING 27844 1726882770.63613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882770.65714: done with get_vars() 27844 1726882770.65737: done getting variables 27844 1726882770.65798: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:30 -0400 (0:00:00.980) 0:00:29.736 ****** 27844 1726882770.65978: entering _queue_task() for managed_node1/debug 27844 1726882770.66322: worker is 1 (out of 1 available) 27844 1726882770.66335: exiting _queue_task() for managed_node1/debug 27844 1726882770.66347: done queuing things up, now waiting for results queue to drain 27844 1726882770.66348: waiting for pending results... 27844 1726882770.67340: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 27844 1726882770.67505: in run() - task 0e448fcc-3ce9-efa9-466a-000000000642 27844 1726882770.67524: variable 'ansible_search_path' from source: unknown 27844 1726882770.67532: variable 'ansible_search_path' from source: unknown 27844 1726882770.67582: calling self._execute() 27844 1726882770.67699: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882770.67709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882770.67722: variable 'omit' from source: magic vars 27844 1726882770.68122: variable 'ansible_distribution_major_version' from source: facts 27844 1726882770.68140: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882770.68152: variable 'omit' from source: magic vars 27844 1726882770.68222: variable 'omit' from source: magic vars 27844 1726882770.68328: variable 'network_provider' from source: set_fact 27844 1726882770.68354: variable 'omit' from source: magic vars 27844 1726882770.68402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 
1726882770.68449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882770.68477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882770.68499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882770.68516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882770.68559: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882770.68571: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882770.68579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882770.68698: Set connection var ansible_shell_type to sh 27844 1726882770.68705: Set connection var ansible_connection to ssh 27844 1726882770.68717: Set connection var ansible_pipelining to False 27844 1726882770.68727: Set connection var ansible_timeout to 10 27844 1726882770.68737: Set connection var ansible_shell_executable to /bin/sh 27844 1726882770.68749: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882770.68788: variable 'ansible_shell_executable' from source: unknown 27844 1726882770.68796: variable 'ansible_connection' from source: unknown 27844 1726882770.68803: variable 'ansible_module_compression' from source: unknown 27844 1726882770.68808: variable 'ansible_shell_type' from source: unknown 27844 1726882770.68814: variable 'ansible_shell_executable' from source: unknown 27844 1726882770.68818: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882770.68824: variable 'ansible_pipelining' from source: unknown 27844 1726882770.68829: variable 'ansible_timeout' from source: unknown 27844 1726882770.68835: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 27844 1726882770.68986: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882770.69008: variable 'omit' from source: magic vars 27844 1726882770.69020: starting attempt loop 27844 1726882770.69028: running the handler 27844 1726882770.69086: handler run complete 27844 1726882770.69117: attempt loop complete, returning result 27844 1726882770.69126: _execute() done 27844 1726882770.69133: dumping result to json 27844 1726882770.69142: done dumping result, returning 27844 1726882770.69156: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-efa9-466a-000000000642] 27844 1726882770.69170: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000642 ok: [managed_node1] => {} MSG: Using network provider: nm 27844 1726882770.69343: no more pending results, returning what we have 27844 1726882770.69348: results queue empty 27844 1726882770.69349: checking for any_errors_fatal 27844 1726882770.69359: done checking for any_errors_fatal 27844 1726882770.69359: checking for max_fail_percentage 27844 1726882770.69361: done checking for max_fail_percentage 27844 1726882770.69362: checking to see if all hosts have failed and the running result is not ok 27844 1726882770.69365: done checking to see if all hosts have failed 27844 1726882770.69366: getting the remaining hosts for this loop 27844 1726882770.69368: done getting the remaining hosts for this loop 27844 1726882770.69373: getting the next task for host managed_node1 27844 1726882770.69380: done getting next task for host managed_node1 27844 1726882770.69385: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 27844 1726882770.69390: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882770.69403: getting variables 27844 1726882770.69405: in VariableManager get_vars() 27844 1726882770.69454: Calling all_inventory to load vars for managed_node1 27844 1726882770.69458: Calling groups_inventory to load vars for managed_node1 27844 1726882770.69461: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882770.69476: Calling all_plugins_play to load vars for managed_node1 27844 1726882770.69480: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882770.69484: Calling groups_plugins_play to load vars for managed_node1 27844 1726882770.70517: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000642 27844 1726882770.70521: WORKER PROCESS EXITING 27844 1726882770.71322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882770.73189: done with get_vars() 27844 1726882770.73215: done getting variables 27844 1726882770.73282: Loading ActionModule 'fail' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:30 -0400 (0:00:00.073) 0:00:29.809 ****** 27844 1726882770.73317: entering _queue_task() for managed_node1/fail 27844 1726882770.73641: worker is 1 (out of 1 available) 27844 1726882770.73653: exiting _queue_task() for managed_node1/fail 27844 1726882770.73668: done queuing things up, now waiting for results queue to drain 27844 1726882770.73670: waiting for pending results... 27844 1726882770.74042: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27844 1726882770.74221: in run() - task 0e448fcc-3ce9-efa9-466a-000000000643 27844 1726882770.74231: variable 'ansible_search_path' from source: unknown 27844 1726882770.74234: variable 'ansible_search_path' from source: unknown 27844 1726882770.74290: calling self._execute() 27844 1726882770.74364: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882770.74373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882770.74380: variable 'omit' from source: magic vars 27844 1726882770.74664: variable 'ansible_distribution_major_version' from source: facts 27844 1726882770.74680: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882770.74759: variable 'network_state' from source: role '' defaults 27844 1726882770.74768: 
Evaluated conditional (network_state != {}): False 27844 1726882770.74774: when evaluation is False, skipping this task 27844 1726882770.74778: _execute() done 27844 1726882770.74780: dumping result to json 27844 1726882770.74783: done dumping result, returning 27844 1726882770.74789: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-efa9-466a-000000000643] 27844 1726882770.74794: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000643 27844 1726882770.74880: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000643 27844 1726882770.74884: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882770.74930: no more pending results, returning what we have 27844 1726882770.74935: results queue empty 27844 1726882770.74936: checking for any_errors_fatal 27844 1726882770.74944: done checking for any_errors_fatal 27844 1726882770.74945: checking for max_fail_percentage 27844 1726882770.74947: done checking for max_fail_percentage 27844 1726882770.74947: checking to see if all hosts have failed and the running result is not ok 27844 1726882770.74948: done checking to see if all hosts have failed 27844 1726882770.74949: getting the remaining hosts for this loop 27844 1726882770.74951: done getting the remaining hosts for this loop 27844 1726882770.74954: getting the next task for host managed_node1 27844 1726882770.74960: done getting next task for host managed_node1 27844 1726882770.74966: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27844 1726882770.74970: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882770.74988: getting variables 27844 1726882770.74989: in VariableManager get_vars() 27844 1726882770.75026: Calling all_inventory to load vars for managed_node1 27844 1726882770.75028: Calling groups_inventory to load vars for managed_node1 27844 1726882770.75030: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882770.75039: Calling all_plugins_play to load vars for managed_node1 27844 1726882770.75041: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882770.75043: Calling groups_plugins_play to load vars for managed_node1 27844 1726882770.75937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882770.81183: done with get_vars() 27844 1726882770.81208: done getting variables 27844 1726882770.81260: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system 
version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:30 -0400 (0:00:00.079) 0:00:29.889 ****** 27844 1726882770.81284: entering _queue_task() for managed_node1/fail 27844 1726882770.81533: worker is 1 (out of 1 available) 27844 1726882770.81546: exiting _queue_task() for managed_node1/fail 27844 1726882770.81558: done queuing things up, now waiting for results queue to drain 27844 1726882770.81560: waiting for pending results... 27844 1726882770.81749: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27844 1726882770.81850: in run() - task 0e448fcc-3ce9-efa9-466a-000000000644 27844 1726882770.81859: variable 'ansible_search_path' from source: unknown 27844 1726882770.81864: variable 'ansible_search_path' from source: unknown 27844 1726882770.81901: calling self._execute() 27844 1726882770.81976: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882770.81981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882770.81990: variable 'omit' from source: magic vars 27844 1726882770.82381: variable 'ansible_distribution_major_version' from source: facts 27844 1726882770.82400: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882770.82539: variable 'network_state' from source: role '' defaults 27844 1726882770.82562: Evaluated conditional (network_state != {}): False 27844 1726882770.82576: when evaluation is False, skipping this task 27844 1726882770.82582: _execute() done 27844 1726882770.82587: dumping result to json 27844 1726882770.82594: done dumping result, returning 27844 1726882770.82603: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-efa9-466a-000000000644] 27844 1726882770.82612: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000644 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882770.82775: no more pending results, returning what we have 27844 1726882770.82779: results queue empty 27844 1726882770.82780: checking for any_errors_fatal 27844 1726882770.82789: done checking for any_errors_fatal 27844 1726882770.82789: checking for max_fail_percentage 27844 1726882770.82791: done checking for max_fail_percentage 27844 1726882770.82792: checking to see if all hosts have failed and the running result is not ok 27844 1726882770.82793: done checking to see if all hosts have failed 27844 1726882770.82794: getting the remaining hosts for this loop 27844 1726882770.82795: done getting the remaining hosts for this loop 27844 1726882770.82799: getting the next task for host managed_node1 27844 1726882770.82806: done getting next task for host managed_node1 27844 1726882770.82810: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27844 1726882770.82814: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882770.82844: getting variables 27844 1726882770.82847: in VariableManager get_vars() 27844 1726882770.82895: Calling all_inventory to load vars for managed_node1 27844 1726882770.82899: Calling groups_inventory to load vars for managed_node1 27844 1726882770.82901: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882770.82914: Calling all_plugins_play to load vars for managed_node1 27844 1726882770.82918: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882770.82921: Calling groups_plugins_play to load vars for managed_node1 27844 1726882770.83853: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000644 27844 1726882770.83857: WORKER PROCESS EXITING 27844 1726882770.84624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882770.87495: done with get_vars() 27844 1726882770.87521: done getting variables 27844 1726882770.87581: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:30 -0400 (0:00:00.063) 0:00:29.952 ****** 27844 1726882770.87616: entering _queue_task() for managed_node1/fail 27844 1726882770.87944: worker is 1 (out of 1 available) 27844 1726882770.87956: exiting _queue_task() for managed_node1/fail 
27844 1726882770.87970: done queuing things up, now waiting for results queue to drain 27844 1726882770.87972: waiting for pending results... 27844 1726882770.88253: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27844 1726882770.88378: in run() - task 0e448fcc-3ce9-efa9-466a-000000000645 27844 1726882770.88391: variable 'ansible_search_path' from source: unknown 27844 1726882770.88396: variable 'ansible_search_path' from source: unknown 27844 1726882770.88434: calling self._execute() 27844 1726882770.88534: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882770.88540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882770.88551: variable 'omit' from source: magic vars 27844 1726882770.88907: variable 'ansible_distribution_major_version' from source: facts 27844 1726882770.88919: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882770.89092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882770.92169: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882770.92230: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882770.92276: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882770.92314: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882770.92338: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882770.92419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882770.92447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882770.92473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882770.92516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882770.92529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882770.92693: variable 'ansible_distribution_major_version' from source: facts 27844 1726882770.92707: Evaluated conditional (ansible_distribution_major_version | int > 9): False 27844 1726882770.92710: when evaluation is False, skipping this task 27844 1726882770.92713: _execute() done 27844 1726882770.92715: dumping result to json 27844 1726882770.92717: done dumping result, returning 27844 1726882770.92729: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-efa9-466a-000000000645] 27844 1726882770.92735: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000645 27844 1726882770.92829: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000645 27844 1726882770.92832: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version 
| int > 9", "skip_reason": "Conditional result was False" } 27844 1726882770.92882: no more pending results, returning what we have 27844 1726882770.92886: results queue empty 27844 1726882770.92887: checking for any_errors_fatal 27844 1726882770.92895: done checking for any_errors_fatal 27844 1726882770.92895: checking for max_fail_percentage 27844 1726882770.92897: done checking for max_fail_percentage 27844 1726882770.92898: checking to see if all hosts have failed and the running result is not ok 27844 1726882770.92899: done checking to see if all hosts have failed 27844 1726882770.92900: getting the remaining hosts for this loop 27844 1726882770.92901: done getting the remaining hosts for this loop 27844 1726882770.92905: getting the next task for host managed_node1 27844 1726882770.92913: done getting next task for host managed_node1 27844 1726882770.92917: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27844 1726882770.92921: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882770.92942: getting variables 27844 1726882770.92944: in VariableManager get_vars() 27844 1726882770.92989: Calling all_inventory to load vars for managed_node1 27844 1726882770.92992: Calling groups_inventory to load vars for managed_node1 27844 1726882770.92995: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882770.93005: Calling all_plugins_play to load vars for managed_node1 27844 1726882770.93009: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882770.93011: Calling groups_plugins_play to load vars for managed_node1 27844 1726882770.94979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882770.97895: done with get_vars() 27844 1726882770.97918: done getting variables 27844 1726882770.97977: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:30 -0400 (0:00:00.103) 0:00:30.056 ****** 27844 1726882770.98010: entering _queue_task() for managed_node1/dnf 27844 1726882770.98732: worker is 1 (out of 1 available) 27844 1726882770.98743: exiting _queue_task() for managed_node1/dnf 27844 1726882770.98755: done queuing things up, now waiting for results queue to drain 27844 1726882770.98756: waiting for pending results... 
27844 1726882770.99193: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27844 1726882770.99325: in run() - task 0e448fcc-3ce9-efa9-466a-000000000646 27844 1726882770.99338: variable 'ansible_search_path' from source: unknown 27844 1726882770.99342: variable 'ansible_search_path' from source: unknown 27844 1726882770.99379: calling self._execute() 27844 1726882770.99477: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882770.99481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882770.99492: variable 'omit' from source: magic vars 27844 1726882770.99860: variable 'ansible_distribution_major_version' from source: facts 27844 1726882770.99874: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.00062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882771.02403: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882771.02478: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882771.02512: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882771.02546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882771.02971: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882771.02975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.02978: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.02980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.02982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.02984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.02986: variable 'ansible_distribution' from source: facts 27844 1726882771.02988: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.02990: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27844 1726882771.02992: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882771.03131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.03153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.03184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.03227: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.03243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.03280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.03302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.03330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.03371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.03385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.03425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.03456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 
1726882771.03484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.03524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.03538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.03699: variable 'network_connections' from source: include params 27844 1726882771.03710: variable 'interface0' from source: play vars 27844 1726882771.03784: variable 'interface0' from source: play vars 27844 1726882771.03792: variable 'interface1' from source: play vars 27844 1726882771.03851: variable 'interface1' from source: play vars 27844 1726882771.03921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882771.04100: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882771.04135: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882771.04168: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882771.04197: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882771.04236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882771.04257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882771.04284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.04315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882771.04359: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882771.04602: variable 'network_connections' from source: include params 27844 1726882771.04606: variable 'interface0' from source: play vars 27844 1726882771.04671: variable 'interface0' from source: play vars 27844 1726882771.04679: variable 'interface1' from source: play vars 27844 1726882771.04741: variable 'interface1' from source: play vars 27844 1726882771.04763: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882771.04770: when evaluation is False, skipping this task 27844 1726882771.04772: _execute() done 27844 1726882771.04775: dumping result to json 27844 1726882771.04777: done dumping result, returning 27844 1726882771.04781: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000646] 27844 1726882771.04786: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000646 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882771.04931: no more pending results, returning what we have 27844 1726882771.04935: 
results queue empty 27844 1726882771.04936: checking for any_errors_fatal 27844 1726882771.04944: done checking for any_errors_fatal 27844 1726882771.04944: checking for max_fail_percentage 27844 1726882771.04946: done checking for max_fail_percentage 27844 1726882771.04947: checking to see if all hosts have failed and the running result is not ok 27844 1726882771.04948: done checking to see if all hosts have failed 27844 1726882771.04949: getting the remaining hosts for this loop 27844 1726882771.04950: done getting the remaining hosts for this loop 27844 1726882771.04954: getting the next task for host managed_node1 27844 1726882771.04961: done getting next task for host managed_node1 27844 1726882771.04967: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27844 1726882771.04971: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882771.04995: getting variables 27844 1726882771.04997: in VariableManager get_vars() 27844 1726882771.05038: Calling all_inventory to load vars for managed_node1 27844 1726882771.05041: Calling groups_inventory to load vars for managed_node1 27844 1726882771.05044: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882771.05055: Calling all_plugins_play to load vars for managed_node1 27844 1726882771.05058: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882771.05062: Calling groups_plugins_play to load vars for managed_node1 27844 1726882771.05800: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000646 27844 1726882771.05804: WORKER PROCESS EXITING 27844 1726882771.06722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882771.08689: done with get_vars() 27844 1726882771.08715: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27844 1726882771.08792: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:31 -0400 (0:00:00.108) 0:00:30.164 ****** 27844 1726882771.08828: entering _queue_task() for managed_node1/yum 27844 1726882771.09154: worker is 1 (out of 1 available) 27844 1726882771.09171: exiting _queue_task() for managed_node1/yum 27844 1726882771.09186: done queuing things up, now 
waiting for results queue to drain 27844 1726882771.09188: waiting for pending results... 27844 1726882771.09497: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27844 1726882771.09596: in run() - task 0e448fcc-3ce9-efa9-466a-000000000647 27844 1726882771.09608: variable 'ansible_search_path' from source: unknown 27844 1726882771.09612: variable 'ansible_search_path' from source: unknown 27844 1726882771.09642: calling self._execute() 27844 1726882771.09729: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.09733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.09744: variable 'omit' from source: magic vars 27844 1726882771.10024: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.10036: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.10152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882771.11899: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882771.11970: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882771.12002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882771.12035: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882771.12059: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882771.12134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.12160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.12189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.12231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.12251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.12330: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.12352: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27844 1726882771.12355: when evaluation is False, skipping this task 27844 1726882771.12358: _execute() done 27844 1726882771.12361: dumping result to json 27844 1726882771.12368: done dumping result, returning 27844 1726882771.12371: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000647] 27844 1726882771.12374: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000647 27844 1726882771.12461: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000647 27844 1726882771.12467: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27844 1726882771.12516: no more pending results, returning what we have 27844 1726882771.12520: results queue empty 27844 1726882771.12521: checking for any_errors_fatal 27844 1726882771.12527: done checking for any_errors_fatal 27844 1726882771.12528: checking for max_fail_percentage 27844 1726882771.12529: done checking for max_fail_percentage 27844 1726882771.12530: checking to see if all hosts have failed and the running result is not ok 27844 1726882771.12531: done checking to see if all hosts have failed 27844 1726882771.12531: getting the remaining hosts for this loop 27844 1726882771.12533: done getting the remaining hosts for this loop 27844 1726882771.12536: getting the next task for host managed_node1 27844 1726882771.12542: done getting next task for host managed_node1 27844 1726882771.12546: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27844 1726882771.12550: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882771.12574: getting variables 27844 1726882771.12577: in VariableManager get_vars() 27844 1726882771.12613: Calling all_inventory to load vars for managed_node1 27844 1726882771.12615: Calling groups_inventory to load vars for managed_node1 27844 1726882771.12617: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882771.12626: Calling all_plugins_play to load vars for managed_node1 27844 1726882771.12628: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882771.12631: Calling groups_plugins_play to load vars for managed_node1 27844 1726882771.13536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882771.14494: done with get_vars() 27844 1726882771.14511: done getting variables 27844 1726882771.14550: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:31 -0400 (0:00:00.057) 0:00:30.222 ****** 27844 1726882771.14577: entering _queue_task() for managed_node1/fail 27844 1726882771.14782: worker is 1 (out of 1 available) 27844 1726882771.14795: exiting _queue_task() for managed_node1/fail 27844 1726882771.15498: done queuing things up, now waiting for results queue to drain 27844 1726882771.15500: waiting for pending results... 
27844 1726882771.15518: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27844 1726882771.15522: in run() - task 0e448fcc-3ce9-efa9-466a-000000000648 27844 1726882771.15525: variable 'ansible_search_path' from source: unknown 27844 1726882771.15529: variable 'ansible_search_path' from source: unknown 27844 1726882771.15531: calling self._execute() 27844 1726882771.15534: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.15537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.15539: variable 'omit' from source: magic vars 27844 1726882771.16052: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.16071: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.16480: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882771.16695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882771.18814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882771.19191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882771.19220: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882771.19254: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882771.19279: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882771.19347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 27844 1726882771.19377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.19397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.19432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.19445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.19491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.19512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.19533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.19588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.19602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.19650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.19679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.19695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.19733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.19757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.19931: variable 'network_connections' from source: include params 27844 1726882771.19940: variable 'interface0' from source: play vars 27844 1726882771.20558: variable 'interface0' from source: play vars 27844 1726882771.20561: variable 'interface1' from source: play vars 27844 1726882771.20569: variable 'interface1' from source: play vars 27844 1726882771.20572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882771.20574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882771.20577: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882771.20579: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882771.20581: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882771.20583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882771.20589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882771.20591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.20593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882771.20595: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882771.21377: variable 'network_connections' from source: include params 27844 1726882771.21380: variable 'interface0' from source: play vars 27844 1726882771.21383: variable 'interface0' from source: play vars 27844 1726882771.21385: variable 'interface1' from source: play vars 27844 1726882771.21387: variable 'interface1' from source: play vars 27844 1726882771.21389: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882771.21390: when evaluation is False, skipping this task 27844 1726882771.21392: _execute() done 27844 1726882771.21394: dumping result to json 27844 1726882771.21396: done dumping result, returning 27844 1726882771.21399: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-000000000648] 27844 1726882771.21407: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000648 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882771.21518: no more pending results, returning what we have 27844 1726882771.21521: results queue empty 27844 1726882771.21522: checking for any_errors_fatal 27844 1726882771.21527: done checking for any_errors_fatal 27844 1726882771.21528: checking for max_fail_percentage 27844 1726882771.21530: done checking for max_fail_percentage 27844 1726882771.21531: checking to see if all hosts have failed and the running result is not ok 27844 1726882771.21531: done checking to see if all hosts have failed 27844 1726882771.21532: getting the remaining hosts for this loop 27844 1726882771.21534: done getting the remaining hosts for this loop 27844 1726882771.21537: getting the next task for host managed_node1 27844 1726882771.21542: done getting next task for host managed_node1 27844 1726882771.21546: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27844 1726882771.21551: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882771.21571: getting variables 27844 1726882771.21573: in VariableManager get_vars() 27844 1726882771.21612: Calling all_inventory to load vars for managed_node1 27844 1726882771.21615: Calling groups_inventory to load vars for managed_node1 27844 1726882771.21617: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882771.21626: Calling all_plugins_play to load vars for managed_node1 27844 1726882771.21628: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882771.21631: Calling groups_plugins_play to load vars for managed_node1 27844 1726882771.22803: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000648 27844 1726882771.22806: WORKER PROCESS EXITING 27844 1726882771.24108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882771.26007: done with get_vars() 27844 1726882771.26029: done getting variables 27844 1726882771.26101: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:31 -0400 (0:00:00.115) 0:00:30.337 ****** 27844 1726882771.26137: entering _queue_task() for managed_node1/package 27844 1726882771.26470: worker is 1 (out of 1 available) 27844 1726882771.26485: exiting _queue_task() for managed_node1/package 27844 
1726882771.26500: done queuing things up, now waiting for results queue to drain 27844 1726882771.26502: waiting for pending results... 27844 1726882771.26841: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 27844 1726882771.27133: in run() - task 0e448fcc-3ce9-efa9-466a-000000000649 27844 1726882771.27154: variable 'ansible_search_path' from source: unknown 27844 1726882771.27163: variable 'ansible_search_path' from source: unknown 27844 1726882771.27232: calling self._execute() 27844 1726882771.27349: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.27362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.27384: variable 'omit' from source: magic vars 27844 1726882771.28029: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.28049: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.28386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882771.29115: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882771.29174: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882771.29228: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882771.29320: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882771.29448: variable 'network_packages' from source: role '' defaults 27844 1726882771.29575: variable '__network_provider_setup' from source: role '' defaults 27844 1726882771.29591: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882771.29676: variable '__network_service_name_default_nm' from source: role '' defaults 27844 
1726882771.29689: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882771.29760: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882771.29972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882771.32195: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882771.32269: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882771.32310: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882771.32353: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882771.32387: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882771.32491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.32525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.32569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.32615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.32638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.32697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.32725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.32759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.32826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.32849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.33014: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27844 1726882771.33090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.33106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.33131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27844 1726882771.33155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.33166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.33231: variable 'ansible_python' from source: facts 27844 1726882771.33248: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27844 1726882771.33310: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882771.33362: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882771.33447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.33469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.33488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.33516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.33527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 
1726882771.33558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.33582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.33599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.33626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.33638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.33735: variable 'network_connections' from source: include params 27844 1726882771.33742: variable 'interface0' from source: play vars 27844 1726882771.33814: variable 'interface0' from source: play vars 27844 1726882771.33822: variable 'interface1' from source: play vars 27844 1726882771.33896: variable 'interface1' from source: play vars 27844 1726882771.33943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882771.33964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882771.33989: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.34009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882771.34049: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882771.34228: variable 'network_connections' from source: include params 27844 1726882771.34231: variable 'interface0' from source: play vars 27844 1726882771.34306: variable 'interface0' from source: play vars 27844 1726882771.34314: variable 'interface1' from source: play vars 27844 1726882771.34385: variable 'interface1' from source: play vars 27844 1726882771.34408: variable '__network_packages_default_wireless' from source: role '' defaults 27844 1726882771.34460: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882771.34713: variable 'network_connections' from source: include params 27844 1726882771.34725: variable 'interface0' from source: play vars 27844 1726882771.34824: variable 'interface0' from source: play vars 27844 1726882771.34839: variable 'interface1' from source: play vars 27844 1726882771.34907: variable 'interface1' from source: play vars 27844 1726882771.34946: variable '__network_packages_default_team' from source: role '' defaults 27844 1726882771.35039: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882771.35356: variable 'network_connections' from source: include params 27844 1726882771.35376: variable 'interface0' from source: play vars 27844 1726882771.35474: variable 'interface0' from source: play vars 27844 1726882771.35504: variable 'interface1' from source: play vars 27844 1726882771.35627: variable 'interface1' from source: play vars 27844 1726882771.35714: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882771.35756: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882771.35762: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882771.35829: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882771.35976: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27844 1726882771.36276: variable 'network_connections' from source: include params 27844 1726882771.36279: variable 'interface0' from source: play vars 27844 1726882771.36321: variable 'interface0' from source: play vars 27844 1726882771.36327: variable 'interface1' from source: play vars 27844 1726882771.36374: variable 'interface1' from source: play vars 27844 1726882771.36381: variable 'ansible_distribution' from source: facts 27844 1726882771.36384: variable '__network_rh_distros' from source: role '' defaults 27844 1726882771.36390: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.36401: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27844 1726882771.36509: variable 'ansible_distribution' from source: facts 27844 1726882771.36512: variable '__network_rh_distros' from source: role '' defaults 27844 1726882771.36515: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.36527: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27844 1726882771.36645: variable 'ansible_distribution' from source: facts 27844 1726882771.36648: variable '__network_rh_distros' from source: role '' defaults 27844 1726882771.36653: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.36686: variable 'network_provider' from source: set_fact 27844 1726882771.36698: variable 'ansible_facts' from source: unknown 
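The next entry evaluates the install gate, `not network_packages is subset(ansible_facts.packages.keys())`. Jinja2's `subset` test is plain set containment, so the logic can be emulated in Python as follows (the package names here are illustrative, not taken from this run):

```python
# Hedged emulation of the Jinja2 `subset` test behind the
# "Install packages" task. Package names are made up for illustration.
network_packages = ["NetworkManager"]          # what the role wants installed
package_facts = {"NetworkManager": [{"version": "1.0"}],
                 "openssh-server": [{"version": "8.0"}]}

# `a is subset(b)` in Jinja2 corresponds to set(a) <= set(b) in Python.
needs_install = not set(network_packages) <= set(package_facts.keys())
print(needs_install)  # False -> everything is present, task is skipped
```
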
27844 1726882771.37153: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27844 1726882771.37156: when evaluation is False, skipping this task 27844 1726882771.37159: _execute() done 27844 1726882771.37161: dumping result to json 27844 1726882771.37164: done dumping result, returning 27844 1726882771.37174: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-efa9-466a-000000000649] 27844 1726882771.37178: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000649 27844 1726882771.37269: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000649 27844 1726882771.37272: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27844 1726882771.37320: no more pending results, returning what we have 27844 1726882771.37324: results queue empty 27844 1726882771.37325: checking for any_errors_fatal 27844 1726882771.37333: done checking for any_errors_fatal 27844 1726882771.37334: checking for max_fail_percentage 27844 1726882771.37335: done checking for max_fail_percentage 27844 1726882771.37336: checking to see if all hosts have failed and the running result is not ok 27844 1726882771.37337: done checking to see if all hosts have failed 27844 1726882771.37337: getting the remaining hosts for this loop 27844 1726882771.37339: done getting the remaining hosts for this loop 27844 1726882771.37342: getting the next task for host managed_node1 27844 1726882771.37349: done getting next task for host managed_node1 27844 1726882771.37353: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27844 1726882771.37357: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882771.37384: getting variables 27844 1726882771.37386: in VariableManager get_vars() 27844 1726882771.37426: Calling all_inventory to load vars for managed_node1 27844 1726882771.37429: Calling groups_inventory to load vars for managed_node1 27844 1726882771.37431: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882771.37441: Calling all_plugins_play to load vars for managed_node1 27844 1726882771.37444: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882771.37446: Calling groups_plugins_play to load vars for managed_node1 27844 1726882771.38771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882771.40109: done with get_vars() 27844 1726882771.40125: done getting variables 27844 1726882771.40170: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task 
path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:31 -0400 (0:00:00.140) 0:00:30.478 ****** 27844 1726882771.40195: entering _queue_task() for managed_node1/package 27844 1726882771.40408: worker is 1 (out of 1 available) 27844 1726882771.40421: exiting _queue_task() for managed_node1/package 27844 1726882771.40433: done queuing things up, now waiting for results queue to drain 27844 1726882771.40435: waiting for pending results... 27844 1726882771.40629: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27844 1726882771.40722: in run() - task 0e448fcc-3ce9-efa9-466a-00000000064a 27844 1726882771.40732: variable 'ansible_search_path' from source: unknown 27844 1726882771.40736: variable 'ansible_search_path' from source: unknown 27844 1726882771.40771: calling self._execute() 27844 1726882771.40850: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.40854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.40863: variable 'omit' from source: magic vars 27844 1726882771.41140: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.41150: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.41234: variable 'network_state' from source: role '' defaults 27844 1726882771.41241: Evaluated conditional (network_state != {}): False 27844 1726882771.41244: when evaluation is False, skipping this task 27844 1726882771.41247: _execute() done 27844 1726882771.41250: dumping result to json 27844 1726882771.41252: done dumping result, returning 27844 1726882771.41258: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 
[0e448fcc-3ce9-efa9-466a-00000000064a] 27844 1726882771.41268: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064a 27844 1726882771.41353: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064a 27844 1726882771.41356: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882771.41418: no more pending results, returning what we have 27844 1726882771.41421: results queue empty 27844 1726882771.41422: checking for any_errors_fatal 27844 1726882771.41428: done checking for any_errors_fatal 27844 1726882771.41429: checking for max_fail_percentage 27844 1726882771.41431: done checking for max_fail_percentage 27844 1726882771.41432: checking to see if all hosts have failed and the running result is not ok 27844 1726882771.41432: done checking to see if all hosts have failed 27844 1726882771.41433: getting the remaining hosts for this loop 27844 1726882771.41435: done getting the remaining hosts for this loop 27844 1726882771.41438: getting the next task for host managed_node1 27844 1726882771.41444: done getting next task for host managed_node1 27844 1726882771.41448: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27844 1726882771.41452: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882771.41489: getting variables 27844 1726882771.41491: in VariableManager get_vars() 27844 1726882771.41529: Calling all_inventory to load vars for managed_node1 27844 1726882771.41531: Calling groups_inventory to load vars for managed_node1 27844 1726882771.41534: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882771.41542: Calling all_plugins_play to load vars for managed_node1 27844 1726882771.41545: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882771.41547: Calling groups_plugins_play to load vars for managed_node1 27844 1726882771.42855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882771.44368: done with get_vars() 27844 1726882771.44386: done getting variables 27844 1726882771.44426: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:31 -0400 (0:00:00.042) 0:00:30.521 ****** 27844 1726882771.44447: entering _queue_task() for managed_node1/package 27844 1726882771.44639: worker is 1 (out of 1 available) 27844 1726882771.44652: exiting _queue_task() for managed_node1/package 27844 1726882771.44668: done queuing things up, now waiting for results queue to drain 27844 1726882771.44670: waiting for pending 
results... 27844 1726882771.44835: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27844 1726882771.44922: in run() - task 0e448fcc-3ce9-efa9-466a-00000000064b 27844 1726882771.44931: variable 'ansible_search_path' from source: unknown 27844 1726882771.44935: variable 'ansible_search_path' from source: unknown 27844 1726882771.44965: calling self._execute() 27844 1726882771.45039: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.45043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.45051: variable 'omit' from source: magic vars 27844 1726882771.45317: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.45327: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.45410: variable 'network_state' from source: role '' defaults 27844 1726882771.45418: Evaluated conditional (network_state != {}): False 27844 1726882771.45421: when evaluation is False, skipping this task 27844 1726882771.45423: _execute() done 27844 1726882771.45426: dumping result to json 27844 1726882771.45428: done dumping result, returning 27844 1726882771.45436: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-efa9-466a-00000000064b] 27844 1726882771.45443: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064b 27844 1726882771.45534: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064b 27844 1726882771.45537: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882771.45597: no more pending results, returning what we have 27844 1726882771.45600: results queue empty 27844 1726882771.45601: checking for 
any_errors_fatal 27844 1726882771.45607: done checking for any_errors_fatal 27844 1726882771.45608: checking for max_fail_percentage 27844 1726882771.45609: done checking for max_fail_percentage 27844 1726882771.45610: checking to see if all hosts have failed and the running result is not ok 27844 1726882771.45611: done checking to see if all hosts have failed 27844 1726882771.45611: getting the remaining hosts for this loop 27844 1726882771.45613: done getting the remaining hosts for this loop 27844 1726882771.45615: getting the next task for host managed_node1 27844 1726882771.45621: done getting next task for host managed_node1 27844 1726882771.45624: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27844 1726882771.45628: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882771.45643: getting variables 27844 1726882771.45645: in VariableManager get_vars() 27844 1726882771.45682: Calling all_inventory to load vars for managed_node1 27844 1726882771.45684: Calling groups_inventory to load vars for managed_node1 27844 1726882771.45685: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882771.45691: Calling all_plugins_play to load vars for managed_node1 27844 1726882771.45693: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882771.45695: Calling groups_plugins_play to load vars for managed_node1 27844 1726882771.46877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882771.48022: done with get_vars() 27844 1726882771.48037: done getting variables 27844 1726882771.48084: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:31 -0400 (0:00:00.036) 0:00:30.557 ****** 27844 1726882771.48106: entering _queue_task() for managed_node1/service 27844 1726882771.48304: worker is 1 (out of 1 available) 27844 1726882771.48317: exiting _queue_task() for managed_node1/service 27844 1726882771.48330: done queuing things up, now waiting for results queue to drain 27844 1726882771.48332: waiting for pending results... 
27844 1726882771.48501: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27844 1726882771.48582: in run() - task 0e448fcc-3ce9-efa9-466a-00000000064c 27844 1726882771.48591: variable 'ansible_search_path' from source: unknown 27844 1726882771.48595: variable 'ansible_search_path' from source: unknown 27844 1726882771.48623: calling self._execute() 27844 1726882771.48708: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.48712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.48720: variable 'omit' from source: magic vars 27844 1726882771.48994: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.49006: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.49089: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882771.49220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882771.50802: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882771.50894: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882771.50899: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882771.50930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882771.50948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882771.51010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 27844 1726882771.51033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.51050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.51080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.51092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.51124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.51141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.51157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.51189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.51197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.51226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.51243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.51260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.51287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.51298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.51406: variable 'network_connections' from source: include params 27844 1726882771.51416: variable 'interface0' from source: play vars 27844 1726882771.51471: variable 'interface0' from source: play vars 27844 1726882771.51479: variable 'interface1' from source: play vars 27844 1726882771.51522: variable 'interface1' from source: play vars 27844 1726882771.51573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882771.51685: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882771.51712: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882771.51735: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882771.51757: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882771.51791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882771.51807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882771.51824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.51843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882771.51884: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882771.52034: variable 'network_connections' from source: include params 27844 1726882771.52037: variable 'interface0' from source: play vars 27844 1726882771.52082: variable 'interface0' from source: play vars 27844 1726882771.52089: variable 'interface1' from source: play vars 27844 1726882771.52131: variable 'interface1' from source: play vars 27844 1726882771.52147: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27844 1726882771.52150: when evaluation is False, skipping this task 27844 1726882771.52153: _execute() done 27844 1726882771.52156: dumping result to json 27844 1726882771.52158: done dumping result, returning 27844 1726882771.52165: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-efa9-466a-00000000064c] 27844 1726882771.52179: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064c 27844 1726882771.52254: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064c 27844 1726882771.52257: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27844 1726882771.52334: no more pending results, returning what we have 27844 1726882771.52337: results queue empty 27844 1726882771.52338: checking for any_errors_fatal 27844 1726882771.52344: done checking for any_errors_fatal 27844 1726882771.52345: checking for max_fail_percentage 27844 1726882771.52346: done checking for max_fail_percentage 27844 1726882771.52347: checking to see if all hosts have failed and the running result is not ok 27844 1726882771.52348: done checking to see if all hosts have failed 27844 1726882771.52348: getting the remaining hosts for this loop 27844 1726882771.52349: done getting the remaining hosts for this loop 27844 1726882771.52353: getting the next task for host managed_node1 27844 1726882771.52357: done getting next task for host managed_node1 27844 1726882771.52361: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27844 1726882771.52369: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882771.52389: getting variables 27844 1726882771.52391: in VariableManager get_vars() 27844 1726882771.52424: Calling all_inventory to load vars for managed_node1 27844 1726882771.52426: Calling groups_inventory to load vars for managed_node1 27844 1726882771.52430: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882771.52437: Calling all_plugins_play to load vars for managed_node1 27844 1726882771.52438: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882771.52440: Calling groups_plugins_play to load vars for managed_node1 27844 1726882771.53216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882771.54141: done with get_vars() 27844 1726882771.54156: done getting variables 27844 1726882771.54197: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:31 -0400 (0:00:00.061) 0:00:30.618 ****** 27844 1726882771.54221: entering _queue_task() for managed_node1/service 27844 1726882771.54398: worker is 1 (out of 1 available) 27844 1726882771.54409: 
exiting _queue_task() for managed_node1/service 27844 1726882771.54422: done queuing things up, now waiting for results queue to drain 27844 1726882771.54424: waiting for pending results... 27844 1726882771.54609: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27844 1726882771.54694: in run() - task 0e448fcc-3ce9-efa9-466a-00000000064d 27844 1726882771.54704: variable 'ansible_search_path' from source: unknown 27844 1726882771.54708: variable 'ansible_search_path' from source: unknown 27844 1726882771.54738: calling self._execute() 27844 1726882771.54817: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.54820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.54830: variable 'omit' from source: magic vars 27844 1726882771.55109: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.55119: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882771.55228: variable 'network_provider' from source: set_fact 27844 1726882771.55231: variable 'network_state' from source: role '' defaults 27844 1726882771.55240: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27844 1726882771.55246: variable 'omit' from source: magic vars 27844 1726882771.55286: variable 'omit' from source: magic vars 27844 1726882771.55309: variable 'network_service_name' from source: role '' defaults 27844 1726882771.55361: variable 'network_service_name' from source: role '' defaults 27844 1726882771.55434: variable '__network_provider_setup' from source: role '' defaults 27844 1726882771.55438: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882771.55489: variable '__network_service_name_default_nm' from source: role '' defaults 27844 1726882771.55493: variable '__network_packages_default_nm' from source: role '' defaults 27844 
1726882771.55539: variable '__network_packages_default_nm' from source: role '' defaults 27844 1726882771.55686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882771.57210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882771.57496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882771.57523: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882771.57548: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882771.57573: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882771.57626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.57646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.57663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.57698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.57708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 27844 1726882771.57738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.57753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.57775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.57803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.57814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.57953: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27844 1726882771.58031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.58048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.58066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.58094: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.58108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.58167: variable 'ansible_python' from source: facts 27844 1726882771.58184: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27844 1726882771.58240: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882771.58296: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882771.58382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.58399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.58416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.58444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.58455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.58491: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882771.58510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882771.58526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.58556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882771.58567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882771.58657: variable 'network_connections' from source: include params 27844 1726882771.58665: variable 'interface0' from source: play vars 27844 1726882771.58719: variable 'interface0' from source: play vars 27844 1726882771.58729: variable 'interface1' from source: play vars 27844 1726882771.58785: variable 'interface1' from source: play vars 27844 1726882771.58852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882771.58970: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882771.59016: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882771.59046: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882771.59081: Loading TestModule 'uri' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882771.59126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882771.59146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882771.59171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882771.59200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882771.59233: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882771.59409: variable 'network_connections' from source: include params 27844 1726882771.59417: variable 'interface0' from source: play vars 27844 1726882771.59468: variable 'interface0' from source: play vars 27844 1726882771.59480: variable 'interface1' from source: play vars 27844 1726882771.59532: variable 'interface1' from source: play vars 27844 1726882771.59556: variable '__network_packages_default_wireless' from source: role '' defaults 27844 1726882771.59612: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882771.59805: variable 'network_connections' from source: include params 27844 1726882771.59808: variable 'interface0' from source: play vars 27844 1726882771.59860: variable 'interface0' from source: play vars 27844 1726882771.59866: variable 'interface1' from source: play vars 27844 1726882771.59918: variable 'interface1' from source: play vars 27844 1726882771.59934: variable 
'__network_packages_default_team' from source: role '' defaults 27844 1726882771.59993: variable '__network_team_connections_defined' from source: role '' defaults 27844 1726882771.60178: variable 'network_connections' from source: include params 27844 1726882771.60181: variable 'interface0' from source: play vars 27844 1726882771.60231: variable 'interface0' from source: play vars 27844 1726882771.60237: variable 'interface1' from source: play vars 27844 1726882771.60291: variable 'interface1' from source: play vars 27844 1726882771.60327: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882771.60368: variable '__network_service_name_default_initscripts' from source: role '' defaults 27844 1726882771.60377: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882771.60421: variable '__network_packages_default_initscripts' from source: role '' defaults 27844 1726882771.60556: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27844 1726882771.60870: variable 'network_connections' from source: include params 27844 1726882771.60876: variable 'interface0' from source: play vars 27844 1726882771.60917: variable 'interface0' from source: play vars 27844 1726882771.60923: variable 'interface1' from source: play vars 27844 1726882771.60968: variable 'interface1' from source: play vars 27844 1726882771.60977: variable 'ansible_distribution' from source: facts 27844 1726882771.60980: variable '__network_rh_distros' from source: role '' defaults 27844 1726882771.60986: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.60997: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27844 1726882771.61112: variable 'ansible_distribution' from source: facts 27844 1726882771.61116: variable '__network_rh_distros' from source: role '' defaults 27844 1726882771.61120: variable 
'ansible_distribution_major_version' from source: facts 27844 1726882771.61130: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27844 1726882771.61242: variable 'ansible_distribution' from source: facts 27844 1726882771.61246: variable '__network_rh_distros' from source: role '' defaults 27844 1726882771.61249: variable 'ansible_distribution_major_version' from source: facts 27844 1726882771.61280: variable 'network_provider' from source: set_fact 27844 1726882771.61295: variable 'omit' from source: magic vars 27844 1726882771.61314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882771.61334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882771.61347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882771.61364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882771.61376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882771.61395: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882771.61398: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.61400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.61462: Set connection var ansible_shell_type to sh 27844 1726882771.61470: Set connection var ansible_connection to ssh 27844 1726882771.61473: Set connection var ansible_pipelining to False 27844 1726882771.61483: Set connection var ansible_timeout to 10 27844 1726882771.61486: Set connection var ansible_shell_executable to /bin/sh 27844 1726882771.61491: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882771.61510: 
variable 'ansible_shell_executable' from source: unknown 27844 1726882771.61513: variable 'ansible_connection' from source: unknown 27844 1726882771.61516: variable 'ansible_module_compression' from source: unknown 27844 1726882771.61518: variable 'ansible_shell_type' from source: unknown 27844 1726882771.61520: variable 'ansible_shell_executable' from source: unknown 27844 1726882771.61526: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882771.61528: variable 'ansible_pipelining' from source: unknown 27844 1726882771.61530: variable 'ansible_timeout' from source: unknown 27844 1726882771.61532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882771.61604: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882771.61612: variable 'omit' from source: magic vars 27844 1726882771.61617: starting attempt loop 27844 1726882771.61620: running the handler 27844 1726882771.61676: variable 'ansible_facts' from source: unknown 27844 1726882771.62084: _low_level_execute_command(): starting 27844 1726882771.62089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882771.62590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882771.62598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882771.62628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 
1726882771.62641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.62701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882771.62707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882771.62719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882771.62829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882771.64487: stdout chunk (state=3): >>>/root <<< 27844 1726882771.64589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882771.64639: stderr chunk (state=3): >>><<< 27844 1726882771.64642: stdout chunk (state=3): >>><<< 27844 1726882771.64658: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882771.64670: _low_level_execute_command(): starting 27844 1726882771.64674: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584 `" && echo ansible-tmp-1726882771.6465733-29247-56098701066584="` echo /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584 `" ) && sleep 0' 27844 1726882771.65100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882771.65106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882771.65136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.65148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.65202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882771.65214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882771.65314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882771.67197: stdout chunk (state=3): >>>ansible-tmp-1726882771.6465733-29247-56098701066584=/root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584 <<< 27844 1726882771.67306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882771.67345: stderr chunk (state=3): >>><<< 27844 1726882771.67348: stdout chunk (state=3): >>><<< 27844 1726882771.67360: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882771.6465733-29247-56098701066584=/root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882771.67387: variable 'ansible_module_compression' from source: unknown 27844 1726882771.67430: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 27844 1726882771.67484: variable 'ansible_facts' from source: unknown 27844 1726882771.67625: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584/AnsiballZ_systemd.py 27844 1726882771.67725: Sending initial data 27844 1726882771.67730: Sent initial data (155 bytes) 27844 1726882771.68369: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882771.68384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882771.68413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.68425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.68477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882771.68489: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882771.68596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882771.70310: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882771.70397: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882771.70494: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpspew88ep /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584/AnsiballZ_systemd.py <<< 27844 1726882771.70586: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882771.72530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882771.72615: stderr chunk (state=3): >>><<< 27844 1726882771.72619: stdout chunk (state=3): >>><<< 27844 1726882771.72632: done transferring module to remote 27844 1726882771.72640: _low_level_execute_command(): starting 27844 1726882771.72645: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584/ /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584/AnsiballZ_systemd.py && sleep 0' 27844 1726882771.73052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882771.73057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882771.73092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.73103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.73154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882771.73170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882771.73269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882771.74987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882771.75026: stderr chunk (state=3): >>><<< 27844 1726882771.75029: stdout chunk (state=3): >>><<< 27844 1726882771.75039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882771.75046: _low_level_execute_command(): starting 27844 1726882771.75051: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584/AnsiballZ_systemd.py && sleep 0' 27844 1726882771.75447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882771.75453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882771.75485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882771.75497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882771.75509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882771.75560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882771.75571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882771.75679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882772.00524: stdout chunk (state=3): >>> <<< 27844 1726882772.00571: stdout chunk (state=3): >>>{"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16244736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1368776000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0<<< 27844 1726882772.00593: stdout chunk (state=3): >>>", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": 
"no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": 
"526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27844 1726882772.02103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882772.02106: stdout chunk (state=3): >>><<< 27844 1726882772.02109: stderr chunk (state=3): >>><<< 27844 1726882772.02174: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16244736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1368776000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882772.02372: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882772.02376: _low_level_execute_command(): starting 27844 1726882772.02379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882771.6465733-29247-56098701066584/ > /dev/null 2>&1 && sleep 0' 27844 1726882772.04027: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882772.04103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.04113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.04154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.04193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882772.04257: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882772.04271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.04284: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882772.04292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882772.04298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882772.04308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.04320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.04332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.04338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882772.04345: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882772.04356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.04424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882772.04516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882772.04528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882772.04670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882772.06971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882772.06974: stdout chunk (state=3): >>><<< 27844 1726882772.06976: stderr chunk (state=3): >>><<< 27844 1726882772.06978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882772.06980: handler run complete 27844 1726882772.06982: attempt loop complete, returning result 27844 1726882772.06984: _execute() done 27844 1726882772.06986: dumping result to json 27844 1726882772.06988: done dumping result, returning 27844 1726882772.06991: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-efa9-466a-00000000064d] 27844 1726882772.06992: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064d 27844 1726882772.07138: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064d 27844 1726882772.07141: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882772.07201: no more pending results, returning what we have 27844 1726882772.07204: results queue empty 27844 1726882772.07205: checking for any_errors_fatal 27844 1726882772.07211: done checking for any_errors_fatal 27844 1726882772.07212: checking for max_fail_percentage 27844 1726882772.07214: done checking for max_fail_percentage 27844 1726882772.07214: checking to see 
if all hosts have failed and the running result is not ok 27844 1726882772.07215: done checking to see if all hosts have failed 27844 1726882772.07216: getting the remaining hosts for this loop 27844 1726882772.07217: done getting the remaining hosts for this loop 27844 1726882772.07221: getting the next task for host managed_node1 27844 1726882772.07227: done getting next task for host managed_node1 27844 1726882772.07231: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27844 1726882772.07235: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882772.07245: getting variables 27844 1726882772.07247: in VariableManager get_vars() 27844 1726882772.07283: Calling all_inventory to load vars for managed_node1 27844 1726882772.07286: Calling groups_inventory to load vars for managed_node1 27844 1726882772.07288: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882772.07297: Calling all_plugins_play to load vars for managed_node1 27844 1726882772.07300: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882772.07303: Calling groups_plugins_play to load vars for managed_node1 27844 1726882772.10143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882772.13354: done with get_vars() 27844 1726882772.13812: done getting variables 27844 1726882772.13878: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:32 -0400 (0:00:00.596) 0:00:31.215 ****** 27844 1726882772.13912: entering _queue_task() for managed_node1/service 27844 1726882772.14220: worker is 1 (out of 1 available) 27844 1726882772.14231: exiting _queue_task() for managed_node1/service 27844 1726882772.14245: done queuing things up, now waiting for results queue to drain 27844 1726882772.14246: waiting for pending results... 
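The censored result above ("Enable and start NetworkManager", hidden because `no_log: true` was specified) was produced by the `service` action dispatching to `ansible.legacy.systemd` with the logged module args. A minimal sketch of an equivalent task, assuming the generic `service` module form — the role's exact task source is not in this log; only `name`, `state`, and `enabled` are quoted from the logged `module_args`:

```yaml
# Sketch reconstructed from the logged module_args; the surrounding task
# layout is an assumption, not the role's actual source.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # matches the "censored" result shown in the log
```

On this systemd-managed host the `service` action was routed to `ansible.legacy.systemd`, which is why the full unit property dump (LimitNPROC, ActiveState, UnitFileState, and so on) appears in the result.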
27844 1726882772.14574: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27844 1726882772.14700: in run() - task 0e448fcc-3ce9-efa9-466a-00000000064e 27844 1726882772.14714: variable 'ansible_search_path' from source: unknown 27844 1726882772.14717: variable 'ansible_search_path' from source: unknown 27844 1726882772.14753: calling self._execute() 27844 1726882772.14889: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882772.14893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882772.14910: variable 'omit' from source: magic vars 27844 1726882772.15291: variable 'ansible_distribution_major_version' from source: facts 27844 1726882772.15304: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882772.15451: variable 'network_provider' from source: set_fact 27844 1726882772.15465: Evaluated conditional (network_provider == "nm"): True 27844 1726882772.15582: variable '__network_wpa_supplicant_required' from source: role '' defaults 27844 1726882772.15656: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27844 1726882772.15826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882772.19499: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882772.19559: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882772.19710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882772.19746: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882772.19774: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882772.19979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882772.20007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882772.20148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882772.20190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882772.20205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882772.20369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882772.20390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882772.20414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882772.20453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882772.20585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882772.20625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882772.20648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882772.20790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882772.20828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882772.20842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882772.21224: variable 'network_connections' from source: include params 27844 1726882772.21237: variable 'interface0' from source: play vars 27844 1726882772.21314: variable 'interface0' from source: play vars 27844 1726882772.21325: variable 'interface1' from source: play vars 27844 1726882772.21499: variable 'interface1' from source: play vars 27844 1726882772.21687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27844 1726882772.21962: Loading TestModule 'core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27844 1726882772.22112: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27844 1726882772.22140: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27844 1726882772.22171: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27844 1726882772.22324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27844 1726882772.22345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27844 1726882772.22371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882772.22395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27844 1726882772.22556: variable '__network_wireless_connections_defined' from source: role '' defaults 27844 1726882772.23034: variable 'network_connections' from source: include params 27844 1726882772.23037: variable 'interface0' from source: play vars 27844 1726882772.23350: variable 'interface0' from source: play vars 27844 1726882772.23353: variable 'interface1' from source: play vars 27844 1726882772.23355: variable 'interface1' from source: play vars 27844 1726882772.24389: Evaluated conditional (__network_wpa_supplicant_required): False 27844 1726882772.24392: when evaluation is False, skipping this task 27844 1726882772.24402: _execute() done 27844 
1726882772.24405: dumping result to json 27844 1726882772.24408: done dumping result, returning 27844 1726882772.24410: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-efa9-466a-00000000064e] 27844 1726882772.24413: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064e 27844 1726882772.24510: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064e 27844 1726882772.24512: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27844 1726882772.24557: no more pending results, returning what we have 27844 1726882772.24560: results queue empty 27844 1726882772.24561: checking for any_errors_fatal 27844 1726882772.24587: done checking for any_errors_fatal 27844 1726882772.24588: checking for max_fail_percentage 27844 1726882772.24589: done checking for max_fail_percentage 27844 1726882772.24590: checking to see if all hosts have failed and the running result is not ok 27844 1726882772.24591: done checking to see if all hosts have failed 27844 1726882772.24592: getting the remaining hosts for this loop 27844 1726882772.24593: done getting the remaining hosts for this loop 27844 1726882772.24597: getting the next task for host managed_node1 27844 1726882772.24604: done getting next task for host managed_node1 27844 1726882772.24608: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27844 1726882772.24612: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882772.24632: getting variables 27844 1726882772.24634: in VariableManager get_vars() 27844 1726882772.24677: Calling all_inventory to load vars for managed_node1 27844 1726882772.24680: Calling groups_inventory to load vars for managed_node1 27844 1726882772.24682: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882772.24690: Calling all_plugins_play to load vars for managed_node1 27844 1726882772.24693: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882772.24695: Calling groups_plugins_play to load vars for managed_node1 27844 1726882772.27207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882772.32138: done with get_vars() 27844 1726882772.32162: done getting variables 27844 1726882772.32229: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:32 -0400 
(0:00:00.183) 0:00:31.399 ****** 27844 1726882772.32262: entering _queue_task() for managed_node1/service 27844 1726882772.32896: worker is 1 (out of 1 available) 27844 1726882772.32907: exiting _queue_task() for managed_node1/service 27844 1726882772.32920: done queuing things up, now waiting for results queue to drain 27844 1726882772.32922: waiting for pending results... 27844 1726882772.33545: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 27844 1726882772.33707: in run() - task 0e448fcc-3ce9-efa9-466a-00000000064f 27844 1726882772.33720: variable 'ansible_search_path' from source: unknown 27844 1726882772.33724: variable 'ansible_search_path' from source: unknown 27844 1726882772.33769: calling self._execute() 27844 1726882772.33860: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882772.33871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882772.33889: variable 'omit' from source: magic vars 27844 1726882772.34261: variable 'ansible_distribution_major_version' from source: facts 27844 1726882772.34277: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882772.34392: variable 'network_provider' from source: set_fact 27844 1726882772.34398: Evaluated conditional (network_provider == "initscripts"): False 27844 1726882772.34401: when evaluation is False, skipping this task 27844 1726882772.34407: _execute() done 27844 1726882772.34410: dumping result to json 27844 1726882772.34413: done dumping result, returning 27844 1726882772.34420: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-efa9-466a-00000000064f] 27844 1726882772.34426: sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064f 27844 1726882772.34518: done sending task result for task 0e448fcc-3ce9-efa9-466a-00000000064f 27844 1726882772.34521: WORKER PROCESS 
EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27844 1726882772.34562: no more pending results, returning what we have 27844 1726882772.34571: results queue empty 27844 1726882772.34572: checking for any_errors_fatal 27844 1726882772.34580: done checking for any_errors_fatal 27844 1726882772.34580: checking for max_fail_percentage 27844 1726882772.34582: done checking for max_fail_percentage 27844 1726882772.34583: checking to see if all hosts have failed and the running result is not ok 27844 1726882772.34583: done checking to see if all hosts have failed 27844 1726882772.34584: getting the remaining hosts for this loop 27844 1726882772.34586: done getting the remaining hosts for this loop 27844 1726882772.34589: getting the next task for host managed_node1 27844 1726882772.34595: done getting next task for host managed_node1 27844 1726882772.34599: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27844 1726882772.34603: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882772.34624: getting variables 27844 1726882772.34626: in VariableManager get_vars() 27844 1726882772.34662: Calling all_inventory to load vars for managed_node1 27844 1726882772.34669: Calling groups_inventory to load vars for managed_node1 27844 1726882772.34671: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882772.34680: Calling all_plugins_play to load vars for managed_node1 27844 1726882772.34683: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882772.34686: Calling groups_plugins_play to load vars for managed_node1 27844 1726882772.36403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882772.38230: done with get_vars() 27844 1726882772.38255: done getting variables 27844 1726882772.38319: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:32 -0400 (0:00:00.060) 0:00:31.460 ****** 27844 1726882772.38354: entering _queue_task() for managed_node1/copy 27844 1726882772.38874: worker is 1 (out of 1 available) 27844 1726882772.38885: exiting _queue_task() for managed_node1/copy 27844 1726882772.38898: done queuing things up, now waiting for results queue to drain 27844 1726882772.38899: waiting for pending results... 
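The "Enable and start wpa_supplicant" task above was skipped because `__network_wpa_supplicant_required` evaluated False — per the `__network_ieee802_1x_connections_defined` and `__network_wireless_connections_defined` variables loaded just before, no 802.1x or wireless connections were defined in this run. A hedged sketch of the gating pattern; the service name and task body are assumptions, while the `when` condition is quoted from the log:

```yaml
# Illustrative sketch only; module arguments are assumed, the condition
# is taken verbatim from the "false_condition" in the skipped result.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant   # assumed service name
    state: started
    enabled: true
  when: __network_wpa_supplicant_required   # False in this run, so skipped
```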
27844 1726882772.39737: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27844 1726882772.39903: in run() - task 0e448fcc-3ce9-efa9-466a-000000000650 27844 1726882772.39921: variable 'ansible_search_path' from source: unknown 27844 1726882772.39928: variable 'ansible_search_path' from source: unknown 27844 1726882772.39976: calling self._execute() 27844 1726882772.40082: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882772.40092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882772.40106: variable 'omit' from source: magic vars 27844 1726882772.40493: variable 'ansible_distribution_major_version' from source: facts 27844 1726882772.40515: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882772.40640: variable 'network_provider' from source: set_fact 27844 1726882772.40651: Evaluated conditional (network_provider == "initscripts"): False 27844 1726882772.40657: when evaluation is False, skipping this task 27844 1726882772.40665: _execute() done 27844 1726882772.40675: dumping result to json 27844 1726882772.40683: done dumping result, returning 27844 1726882772.40693: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-efa9-466a-000000000650] 27844 1726882772.40703: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000650 27844 1726882772.40815: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000650 27844 1726882772.40822: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27844 1726882772.40874: no more pending results, returning what we have 27844 1726882772.40878: results queue empty 27844 1726882772.40880: checking for 
any_errors_fatal 27844 1726882772.40886: done checking for any_errors_fatal 27844 1726882772.40887: checking for max_fail_percentage 27844 1726882772.40888: done checking for max_fail_percentage 27844 1726882772.40889: checking to see if all hosts have failed and the running result is not ok 27844 1726882772.40890: done checking to see if all hosts have failed 27844 1726882772.40891: getting the remaining hosts for this loop 27844 1726882772.40893: done getting the remaining hosts for this loop 27844 1726882772.40897: getting the next task for host managed_node1 27844 1726882772.40903: done getting next task for host managed_node1 27844 1726882772.40907: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27844 1726882772.40912: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882772.40934: getting variables 27844 1726882772.40936: in VariableManager get_vars() 27844 1726882772.40984: Calling all_inventory to load vars for managed_node1 27844 1726882772.40987: Calling groups_inventory to load vars for managed_node1 27844 1726882772.40989: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882772.41002: Calling all_plugins_play to load vars for managed_node1 27844 1726882772.41005: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882772.41008: Calling groups_plugins_play to load vars for managed_node1 27844 1726882772.43014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882772.45639: done with get_vars() 27844 1726882772.45665: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:32 -0400 (0:00:00.073) 0:00:31.534 ****** 27844 1726882772.45752: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 27844 1726882772.46062: worker is 1 (out of 1 available) 27844 1726882772.46078: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 27844 1726882772.46091: done queuing things up, now waiting for results queue to drain 27844 1726882772.46092: waiting for pending results... 
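The skipped task above shows how Ansible short-circuits a task whose `when:` conditional evaluates to False: the expression is rendered against the host's variables, and a synthetic per-host result carrying `false_condition` and `skip_reason` is returned without ever contacting the managed node. A minimal sketch of that flow, under stated assumptions (the real TaskExecutor renders conditionals through its Templar/Jinja2 machinery; the function names and the plain equality check below are illustrative, not Ansible's actual API):

```python
# Sketch only: mimics the skip payload visible in the log, not Ansible's code.

def evaluate_when(task_vars, var_name, expected):
    """Stand-in for templating `var_name == "expected"` against host vars."""
    return task_vars.get(var_name) == expected

def run_task(task_vars, var_name, expected):
    """Return a per-host result dict, short-circuiting on a False `when:`."""
    condition = '%s == "%s"' % (var_name, expected)
    if not evaluate_when(task_vars, var_name, expected):
        # Mirrors the skipping: [managed_node1] => {...} block in the log.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    # A real executor would transfer and run the module here.
    return {"changed": True}

result = run_task({"network_provider": "nm"},
                  "network_provider", "initscripts")
```

With `network_provider` set to `nm` (as in this run), the task is skipped exactly as logged; the host is never touched.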
27844 1726882772.46371: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27844 1726882772.46506: in run() - task 0e448fcc-3ce9-efa9-466a-000000000651 27844 1726882772.46525: variable 'ansible_search_path' from source: unknown 27844 1726882772.46538: variable 'ansible_search_path' from source: unknown 27844 1726882772.46583: calling self._execute() 27844 1726882772.46692: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882772.46702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882772.46715: variable 'omit' from source: magic vars 27844 1726882772.47140: variable 'ansible_distribution_major_version' from source: facts 27844 1726882772.47169: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882772.47197: variable 'omit' from source: magic vars 27844 1726882772.47255: variable 'omit' from source: magic vars 27844 1726882772.48220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27844 1726882772.50671: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27844 1726882772.50741: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27844 1726882772.50791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27844 1726882772.50831: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27844 1726882772.50859: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27844 1726882772.50995: variable 'network_provider' from source: set_fact 27844 1726882772.51152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27844 1726882772.51182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27844 1726882772.51214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27844 1726882772.51259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27844 1726882772.51276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27844 1726882772.51358: variable 'omit' from source: magic vars 27844 1726882772.51477: variable 'omit' from source: magic vars 27844 1726882772.51587: variable 'network_connections' from source: include params 27844 1726882772.51598: variable 'interface0' from source: play vars 27844 1726882772.51671: variable 'interface0' from source: play vars 27844 1726882772.51679: variable 'interface1' from source: play vars 27844 1726882772.51745: variable 'interface1' from source: play vars 27844 1726882772.51903: variable 'omit' from source: magic vars 27844 1726882772.51911: variable '__lsr_ansible_managed' from source: task vars 27844 1726882772.51977: variable '__lsr_ansible_managed' from source: task vars 27844 1726882772.52257: Loaded config def from plugin (lookup/template) 27844 1726882772.52261: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27844 1726882772.52297: File lookup term: 
get_ansible_managed.j2 27844 1726882772.52301: variable 'ansible_search_path' from source: unknown 27844 1726882772.52304: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27844 1726882772.52319: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27844 1726882772.52334: variable 'ansible_search_path' from source: unknown 27844 1726882772.57849: variable 'ansible_managed' from source: unknown 27844 1726882772.57989: variable 'omit' from source: magic vars 27844 1726882772.58011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882772.58030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882772.58043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882772.58074: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882772.58077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882772.58105: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882772.58108: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882772.58111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882772.58172: Set connection var ansible_shell_type to sh 27844 1726882772.58176: Set connection var ansible_connection to ssh 27844 1726882772.58181: Set connection var ansible_pipelining to False 27844 1726882772.58186: Set connection var ansible_timeout to 10 27844 1726882772.58191: Set connection var ansible_shell_executable to /bin/sh 27844 1726882772.58196: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882772.58219: variable 'ansible_shell_executable' from source: unknown 27844 1726882772.58222: variable 'ansible_connection' from source: unknown 27844 1726882772.58224: variable 'ansible_module_compression' from source: unknown 27844 1726882772.58227: variable 'ansible_shell_type' from source: unknown 27844 1726882772.58229: variable 'ansible_shell_executable' from source: unknown 27844 1726882772.58232: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882772.58243: variable 'ansible_pipelining' from source: unknown 27844 1726882772.58246: variable 'ansible_timeout' from source: unknown 27844 1726882772.58248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882772.58337: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882772.58347: variable 'omit' from source: magic vars 27844 1726882772.58352: starting attempt loop 27844 1726882772.58355: running the handler 27844 1726882772.58366: _low_level_execute_command(): starting 27844 1726882772.58376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882772.58839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.58847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.58882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882772.58895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882772.58905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.58946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882772.58952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882772.58978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882772.59087: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27844 1726882772.60759: stdout chunk (state=3): >>>/root <<< 27844 1726882772.60859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882772.60932: stderr chunk (state=3): >>><<< 27844 1726882772.60935: stdout chunk (state=3): >>><<< 27844 1726882772.61027: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882772.61030: _low_level_execute_command(): starting 27844 1726882772.61033: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754 `" && echo ansible-tmp-1726882772.609513-29279-59081446683754="` echo /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754 `" ) && sleep 0' 27844 
1726882772.61549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882772.61583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.61586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.61593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.61635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.61638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.61647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.61657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.61669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882772.61672: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882772.61682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.61748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882772.61777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882772.61779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882772.61877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882772.63730: stdout chunk (state=3): 
>>>ansible-tmp-1726882772.609513-29279-59081446683754=/root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754 <<< 27844 1726882772.63858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882772.63932: stderr chunk (state=3): >>><<< 27844 1726882772.63935: stdout chunk (state=3): >>><<< 27844 1726882772.64273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882772.609513-29279-59081446683754=/root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882772.64277: variable 'ansible_module_compression' from source: unknown 27844 1726882772.64279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 27844 1726882772.64281: variable 
'ansible_facts' from source: unknown 27844 1726882772.64283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754/AnsiballZ_network_connections.py 27844 1726882772.64390: Sending initial data 27844 1726882772.64395: Sent initial data (166 bytes) 27844 1726882772.65284: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882772.65299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.65314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.65333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.65386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882772.65400: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882772.65417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.65435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882772.65447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882772.65463: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882772.65483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.65497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.65513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.65526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882772.65536: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882772.65548: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.65632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882772.65648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882772.65661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882772.65816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882772.67495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 27844 1726882772.67503: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882772.67588: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882772.67683: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp1zaqc8ni /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754/AnsiballZ_network_connections.py <<< 27844 1726882772.67792: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882772.69709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882772.69818: stderr chunk (state=3): >>><<< 27844 1726882772.69822: stdout chunk (state=3): >>><<< 27844 1726882772.69838: done transferring module to remote 27844 1726882772.69848: 
_low_level_execute_command(): starting 27844 1726882772.69853: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754/ /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754/AnsiballZ_network_connections.py && sleep 0' 27844 1726882772.70286: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.70290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.70322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.70326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882772.70329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.70373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882772.70381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882772.70490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882772.72211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882772.72272: stderr chunk (state=3): >>><<< 27844 
1726882772.72276: stdout chunk (state=3): >>><<< 27844 1726882772.72356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882772.72360: _low_level_execute_command(): starting 27844 1726882772.72362: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754/AnsiballZ_network_connections.py && sleep 0' 27844 1726882772.72931: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882772.72944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.72957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.72977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.73022: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882772.73033: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882772.73045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.73061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882772.73084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882772.73094: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882772.73105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882772.73123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882772.73137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882772.73148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882772.73157: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882772.73172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882772.73253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882772.73275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882772.73290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882772.73415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.12619: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/abbd2acd-64d2-4926-8931-5a572400bc47: error=unknown <<< 27844 1726882773.14244: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 27844 1726882773.14290: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/3f556ad4-433c-441e-8cb1-58aca9efd828: error=unknown <<< 27844 1726882773.14476: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27844 1726882773.16018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882773.16075: stderr chunk (state=3): >>><<< 27844 1726882773.16079: stdout chunk (state=3): >>><<< 27844 1726882773.16096: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/abbd2acd-64d2-4926-8931-5a572400bc47: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_17ua4l5x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/3f556ad4-433c-441e-8cb1-58aca9efd828: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882773.16128: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'ethtest1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882773.16136: _low_level_execute_command(): starting 27844 1726882773.16141: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882772.609513-29279-59081446683754/ > /dev/null 2>&1 && sleep 0' 27844 1726882773.16592: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.16596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 27844 1726882773.16631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.16634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.16636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.16697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.16701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.16705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.16798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.18592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.18640: stderr chunk (state=3): >>><<< 27844 1726882773.18643: stdout chunk (state=3): >>><<< 27844 1726882773.18656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882773.18661: handler run complete 27844 1726882773.18687: attempt loop complete, returning result 27844 1726882773.18694: _execute() done 27844 1726882773.18696: dumping result to json 27844 1726882773.18702: done dumping result, returning 27844 1726882773.18710: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-efa9-466a-000000000651] 27844 1726882773.18714: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000651 27844 1726882773.18817: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000651 27844 1726882773.18820: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 27844 1726882773.18926: no more pending results, returning what we have 27844 1726882773.18931: results queue empty 27844 1726882773.18932: checking for any_errors_fatal 27844 1726882773.18938: done checking for any_errors_fatal 
27844 1726882773.18938: checking for max_fail_percentage 27844 1726882773.18940: done checking for max_fail_percentage 27844 1726882773.18941: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.18942: done checking to see if all hosts have failed 27844 1726882773.18943: getting the remaining hosts for this loop 27844 1726882773.18944: done getting the remaining hosts for this loop 27844 1726882773.18948: getting the next task for host managed_node1 27844 1726882773.18953: done getting next task for host managed_node1 27844 1726882773.18957: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27844 1726882773.18961: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882773.18980: getting variables 27844 1726882773.18982: in VariableManager get_vars() 27844 1726882773.19021: Calling all_inventory to load vars for managed_node1 27844 1726882773.19026: Calling groups_inventory to load vars for managed_node1 27844 1726882773.19028: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.19037: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.19040: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.19042: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.20516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.21476: done with get_vars() 27844 1726882773.21493: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:33 -0400 (0:00:00.758) 0:00:32.292 ****** 27844 1726882773.21553: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 27844 1726882773.21769: worker is 1 (out of 1 available) 27844 1726882773.21781: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 27844 1726882773.21795: done queuing things up, now waiting for results queue to drain 27844 1726882773.21797: waiting for pending results... 
27844 1726882773.21973: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 27844 1726882773.22062: in run() - task 0e448fcc-3ce9-efa9-466a-000000000652 27844 1726882773.22075: variable 'ansible_search_path' from source: unknown 27844 1726882773.22078: variable 'ansible_search_path' from source: unknown 27844 1726882773.22107: calling self._execute() 27844 1726882773.22186: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.22190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.22198: variable 'omit' from source: magic vars 27844 1726882773.22580: variable 'ansible_distribution_major_version' from source: facts 27844 1726882773.22583: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882773.22870: variable 'network_state' from source: role '' defaults 27844 1726882773.22873: Evaluated conditional (network_state != {}): False 27844 1726882773.22876: when evaluation is False, skipping this task 27844 1726882773.22878: _execute() done 27844 1726882773.22880: dumping result to json 27844 1726882773.22881: done dumping result, returning 27844 1726882773.22884: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-efa9-466a-000000000652] 27844 1726882773.22886: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000652 27844 1726882773.22942: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000652 27844 1726882773.22944: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27844 1726882773.23256: no more pending results, returning what we have 27844 1726882773.23259: results queue empty 27844 1726882773.23260: checking for any_errors_fatal 27844 1726882773.23271: done checking for any_errors_fatal 
27844 1726882773.23272: checking for max_fail_percentage 27844 1726882773.23274: done checking for max_fail_percentage 27844 1726882773.23275: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.23275: done checking to see if all hosts have failed 27844 1726882773.23276: getting the remaining hosts for this loop 27844 1726882773.23277: done getting the remaining hosts for this loop 27844 1726882773.23280: getting the next task for host managed_node1 27844 1726882773.23286: done getting next task for host managed_node1 27844 1726882773.23290: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27844 1726882773.23295: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882773.23312: getting variables 27844 1726882773.23313: in VariableManager get_vars() 27844 1726882773.23348: Calling all_inventory to load vars for managed_node1 27844 1726882773.23350: Calling groups_inventory to load vars for managed_node1 27844 1726882773.23352: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.23360: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.23363: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.23369: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.24818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.26517: done with get_vars() 27844 1726882773.26538: done getting variables 27844 1726882773.26598: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:33 -0400 (0:00:00.050) 0:00:32.342 ****** 27844 1726882773.26629: entering _queue_task() for managed_node1/debug 27844 1726882773.26895: worker is 1 (out of 1 available) 27844 1726882773.26908: exiting _queue_task() for managed_node1/debug 27844 1726882773.26921: done queuing things up, now waiting for results queue to drain 27844 1726882773.26923: waiting for pending results... 
27844 1726882773.27235: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27844 1726882773.27362: in run() - task 0e448fcc-3ce9-efa9-466a-000000000653 27844 1726882773.27385: variable 'ansible_search_path' from source: unknown 27844 1726882773.27389: variable 'ansible_search_path' from source: unknown 27844 1726882773.27425: calling self._execute() 27844 1726882773.27524: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.27528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.27539: variable 'omit' from source: magic vars 27844 1726882773.27927: variable 'ansible_distribution_major_version' from source: facts 27844 1726882773.27940: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882773.27946: variable 'omit' from source: magic vars 27844 1726882773.28005: variable 'omit' from source: magic vars 27844 1726882773.28041: variable 'omit' from source: magic vars 27844 1726882773.28085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882773.28119: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882773.28142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882773.28160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.28174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.28202: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882773.28206: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.28208: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 27844 1726882773.28315: Set connection var ansible_shell_type to sh 27844 1726882773.28318: Set connection var ansible_connection to ssh 27844 1726882773.28324: Set connection var ansible_pipelining to False 27844 1726882773.28329: Set connection var ansible_timeout to 10 27844 1726882773.28335: Set connection var ansible_shell_executable to /bin/sh 27844 1726882773.28340: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882773.28372: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.28375: variable 'ansible_connection' from source: unknown 27844 1726882773.28378: variable 'ansible_module_compression' from source: unknown 27844 1726882773.28381: variable 'ansible_shell_type' from source: unknown 27844 1726882773.28383: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.28385: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.28389: variable 'ansible_pipelining' from source: unknown 27844 1726882773.28392: variable 'ansible_timeout' from source: unknown 27844 1726882773.28396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.28537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882773.28549: variable 'omit' from source: magic vars 27844 1726882773.28554: starting attempt loop 27844 1726882773.28557: running the handler 27844 1726882773.28694: variable '__network_connections_result' from source: set_fact 27844 1726882773.28744: handler run complete 27844 1726882773.28760: attempt loop complete, returning result 27844 1726882773.28765: _execute() done 27844 1726882773.28770: dumping result to json 27844 1726882773.28773: 
done dumping result, returning 27844 1726882773.28779: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-efa9-466a-000000000653] 27844 1726882773.28785: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000653 27844 1726882773.28884: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000653 27844 1726882773.28887: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 27844 1726882773.28965: no more pending results, returning what we have 27844 1726882773.28969: results queue empty 27844 1726882773.28970: checking for any_errors_fatal 27844 1726882773.28978: done checking for any_errors_fatal 27844 1726882773.28979: checking for max_fail_percentage 27844 1726882773.28981: done checking for max_fail_percentage 27844 1726882773.28982: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.28983: done checking to see if all hosts have failed 27844 1726882773.28984: getting the remaining hosts for this loop 27844 1726882773.28986: done getting the remaining hosts for this loop 27844 1726882773.28990: getting the next task for host managed_node1 27844 1726882773.28997: done getting next task for host managed_node1 27844 1726882773.29003: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27844 1726882773.29008: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882773.29020: getting variables 27844 1726882773.29022: in VariableManager get_vars() 27844 1726882773.29065: Calling all_inventory to load vars for managed_node1 27844 1726882773.29069: Calling groups_inventory to load vars for managed_node1 27844 1726882773.29071: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.29083: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.29086: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.29089: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.30650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.32437: done with get_vars() 27844 1726882773.32458: done getting variables 27844 1726882773.32516: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:33 -0400 (0:00:00.059) 0:00:32.402 ****** 27844 1726882773.32549: entering _queue_task() for managed_node1/debug 27844 1726882773.32823: worker is 1 (out of 1 available) 27844 
1726882773.32837: exiting _queue_task() for managed_node1/debug 27844 1726882773.32850: done queuing things up, now waiting for results queue to drain 27844 1726882773.32852: waiting for pending results... 27844 1726882773.33137: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27844 1726882773.33243: in run() - task 0e448fcc-3ce9-efa9-466a-000000000654 27844 1726882773.33254: variable 'ansible_search_path' from source: unknown 27844 1726882773.33257: variable 'ansible_search_path' from source: unknown 27844 1726882773.33293: calling self._execute() 27844 1726882773.33388: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.33392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.33402: variable 'omit' from source: magic vars 27844 1726882773.33772: variable 'ansible_distribution_major_version' from source: facts 27844 1726882773.33782: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882773.33789: variable 'omit' from source: magic vars 27844 1726882773.33855: variable 'omit' from source: magic vars 27844 1726882773.33888: variable 'omit' from source: magic vars 27844 1726882773.33928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882773.33968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882773.33984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882773.34001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.34012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.34041: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882773.34044: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.34047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.34146: Set connection var ansible_shell_type to sh 27844 1726882773.34149: Set connection var ansible_connection to ssh 27844 1726882773.34154: Set connection var ansible_pipelining to False 27844 1726882773.34160: Set connection var ansible_timeout to 10 27844 1726882773.34170: Set connection var ansible_shell_executable to /bin/sh 27844 1726882773.34177: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882773.34202: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.34205: variable 'ansible_connection' from source: unknown 27844 1726882773.34208: variable 'ansible_module_compression' from source: unknown 27844 1726882773.34210: variable 'ansible_shell_type' from source: unknown 27844 1726882773.34213: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.34217: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.34221: variable 'ansible_pipelining' from source: unknown 27844 1726882773.34223: variable 'ansible_timeout' from source: unknown 27844 1726882773.34227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.34357: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882773.34371: variable 'omit' from source: magic vars 27844 1726882773.34377: starting attempt loop 27844 1726882773.34386: running the handler 27844 1726882773.34434: variable '__network_connections_result' from source: set_fact 27844 
1726882773.34515: variable '__network_connections_result' from source: set_fact 27844 1726882773.34631: handler run complete 27844 1726882773.34656: attempt loop complete, returning result 27844 1726882773.34659: _execute() done 27844 1726882773.34661: dumping result to json 27844 1726882773.34670: done dumping result, returning 27844 1726882773.34676: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-efa9-466a-000000000654] 27844 1726882773.34681: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000654 27844 1726882773.34779: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000654 27844 1726882773.34782: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 27844 1726882773.34905: no more pending results, returning what we have 27844 1726882773.34909: results queue empty 27844 1726882773.34910: checking for any_errors_fatal 27844 1726882773.34916: done checking for any_errors_fatal 27844 1726882773.34916: checking for max_fail_percentage 27844 1726882773.34919: done checking for max_fail_percentage 27844 1726882773.34920: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.34921: done checking to see if all hosts have failed 27844 1726882773.34921: getting the remaining hosts for this loop 27844 1726882773.34923: done getting the remaining hosts for this loop 27844 1726882773.34927: getting the next task for host managed_node1 27844 1726882773.34933: 
done getting next task for host managed_node1 27844 1726882773.34937: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27844 1726882773.34942: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882773.34952: getting variables 27844 1726882773.34954: in VariableManager get_vars() 27844 1726882773.34997: Calling all_inventory to load vars for managed_node1 27844 1726882773.35000: Calling groups_inventory to load vars for managed_node1 27844 1726882773.35003: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.35014: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.35017: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.35020: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.36546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.38209: done with get_vars() 27844 1726882773.38230: done getting variables 27844 1726882773.38288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:33 -0400 (0:00:00.057) 0:00:32.459 ****** 27844 1726882773.38321: entering _queue_task() for managed_node1/debug 27844 1726882773.38582: worker is 1 (out of 1 available) 27844 1726882773.38594: exiting _queue_task() for managed_node1/debug 27844 1726882773.38606: done queuing things up, now waiting for results queue to drain 27844 1726882773.38608: waiting for pending results... 
27844 1726882773.38886: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27844 1726882773.39004: in run() - task 0e448fcc-3ce9-efa9-466a-000000000655 27844 1726882773.39017: variable 'ansible_search_path' from source: unknown 27844 1726882773.39021: variable 'ansible_search_path' from source: unknown 27844 1726882773.39059: calling self._execute() 27844 1726882773.39152: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.39162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.39175: variable 'omit' from source: magic vars 27844 1726882773.39534: variable 'ansible_distribution_major_version' from source: facts 27844 1726882773.39546: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882773.39670: variable 'network_state' from source: role '' defaults 27844 1726882773.39678: Evaluated conditional (network_state != {}): False 27844 1726882773.39681: when evaluation is False, skipping this task 27844 1726882773.39684: _execute() done 27844 1726882773.39686: dumping result to json 27844 1726882773.39688: done dumping result, returning 27844 1726882773.39695: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-efa9-466a-000000000655] 27844 1726882773.39700: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000655 27844 1726882773.39796: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000655 27844 1726882773.39798: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 27844 1726882773.39857: no more pending results, returning what we have 27844 1726882773.39861: results queue empty 27844 1726882773.39863: checking for any_errors_fatal 27844 1726882773.39874: done checking for any_errors_fatal 27844 1726882773.39874: checking for 
max_fail_percentage 27844 1726882773.39877: done checking for max_fail_percentage 27844 1726882773.39878: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.39879: done checking to see if all hosts have failed 27844 1726882773.39880: getting the remaining hosts for this loop 27844 1726882773.39882: done getting the remaining hosts for this loop 27844 1726882773.39885: getting the next task for host managed_node1 27844 1726882773.39891: done getting next task for host managed_node1 27844 1726882773.39895: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27844 1726882773.39900: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882773.39922: getting variables 27844 1726882773.39924: in VariableManager get_vars() 27844 1726882773.39967: Calling all_inventory to load vars for managed_node1 27844 1726882773.39970: Calling groups_inventory to load vars for managed_node1 27844 1726882773.39973: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.39985: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.39989: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.39992: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.45647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.47218: done with get_vars() 27844 1726882773.47244: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:33 -0400 (0:00:00.090) 0:00:32.549 ****** 27844 1726882773.47329: entering _queue_task() for managed_node1/ping 27844 1726882773.47662: worker is 1 (out of 1 available) 27844 1726882773.47676: exiting _queue_task() for managed_node1/ping 27844 1726882773.47689: done queuing things up, now waiting for results queue to drain 27844 1726882773.47691: waiting for pending results... 
27844 1726882773.47995: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 27844 1726882773.48128: in run() - task 0e448fcc-3ce9-efa9-466a-000000000656 27844 1726882773.48145: variable 'ansible_search_path' from source: unknown 27844 1726882773.48150: variable 'ansible_search_path' from source: unknown 27844 1726882773.48190: calling self._execute() 27844 1726882773.48290: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.48294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.48305: variable 'omit' from source: magic vars 27844 1726882773.48698: variable 'ansible_distribution_major_version' from source: facts 27844 1726882773.48712: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882773.48716: variable 'omit' from source: magic vars 27844 1726882773.48771: variable 'omit' from source: magic vars 27844 1726882773.48808: variable 'omit' from source: magic vars 27844 1726882773.48847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882773.48884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882773.48906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882773.48923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.48936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.48965: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882773.48971: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.48974: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 27844 1726882773.49080: Set connection var ansible_shell_type to sh 27844 1726882773.49083: Set connection var ansible_connection to ssh 27844 1726882773.49088: Set connection var ansible_pipelining to False 27844 1726882773.49094: Set connection var ansible_timeout to 10 27844 1726882773.49100: Set connection var ansible_shell_executable to /bin/sh 27844 1726882773.49107: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882773.49139: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.49143: variable 'ansible_connection' from source: unknown 27844 1726882773.49146: variable 'ansible_module_compression' from source: unknown 27844 1726882773.49148: variable 'ansible_shell_type' from source: unknown 27844 1726882773.49151: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.49153: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.49158: variable 'ansible_pipelining' from source: unknown 27844 1726882773.49160: variable 'ansible_timeout' from source: unknown 27844 1726882773.49163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.49350: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882773.49360: variable 'omit' from source: magic vars 27844 1726882773.49369: starting attempt loop 27844 1726882773.49372: running the handler 27844 1726882773.49382: _low_level_execute_command(): starting 27844 1726882773.49391: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882773.50112: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882773.50122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 
1726882773.50132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.50147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.50188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.50197: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882773.50207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.50223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882773.50231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882773.50239: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882773.50247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.50259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.50272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.50279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.50286: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882773.50297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.50374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.50393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.50406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.50533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 
1726882773.52190: stdout chunk (state=3): >>>/root <<< 27844 1726882773.52297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.52356: stderr chunk (state=3): >>><<< 27844 1726882773.52359: stdout chunk (state=3): >>><<< 27844 1726882773.52389: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882773.52399: _low_level_execute_command(): starting 27844 1726882773.52406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808 `" && echo ansible-tmp-1726882773.5238671-29319-120831419061808="` echo /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808 `" ) && sleep 0' 27844 1726882773.53002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 27844 1726882773.53010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.53020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.53033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.53072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.53079: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882773.53089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.53103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882773.53110: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882773.53113: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882773.53122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.53131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.53142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.53148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.53155: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882773.53165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.53237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.53251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.53256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 27844 1726882773.53388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.55260: stdout chunk (state=3): >>>ansible-tmp-1726882773.5238671-29319-120831419061808=/root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808 <<< 27844 1726882773.55376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.55444: stderr chunk (state=3): >>><<< 27844 1726882773.55447: stdout chunk (state=3): >>><<< 27844 1726882773.55468: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882773.5238671-29319-120831419061808=/root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882773.55509: variable 'ansible_module_compression' from source: unknown 27844 1726882773.55548: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 27844 1726882773.55585: variable 'ansible_facts' from source: unknown 27844 1726882773.55658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808/AnsiballZ_ping.py 27844 1726882773.55791: Sending initial data 27844 1726882773.55794: Sent initial data (153 bytes) 27844 1726882773.56672: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882773.56681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.56693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.56705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.56743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.56750: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882773.56759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.56780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882773.56786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882773.56792: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882773.56800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.56809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.56821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.56828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 
1726882773.56834: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882773.56842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.56924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.56931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.56945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.57071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.58794: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882773.58890: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882773.58988: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp3975xdql /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808/AnsiballZ_ping.py <<< 27844 1726882773.59083: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882773.60303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.60430: stderr chunk (state=3): >>><<< 27844 1726882773.60433: stdout chunk (state=3): >>><<< 27844 1726882773.60453: done transferring module 
to remote 27844 1726882773.60462: _low_level_execute_command(): starting 27844 1726882773.60473: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808/ /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808/AnsiballZ_ping.py && sleep 0' 27844 1726882773.61131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882773.61139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.61150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.61163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.61201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.61210: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882773.61218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.61231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882773.61238: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882773.61244: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882773.61251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.61260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.61273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.61280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.61286: stderr chunk (state=3): >>>debug2: match found <<< 27844 
1726882773.61296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.61370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.61382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.61393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.61519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.63226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.63276: stderr chunk (state=3): >>><<< 27844 1726882773.63279: stdout chunk (state=3): >>><<< 27844 1726882773.63292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882773.63295: 
_low_level_execute_command(): starting 27844 1726882773.63299: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808/AnsiballZ_ping.py && sleep 0' 27844 1726882773.63914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882773.63924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.63937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.63950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.63990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.63997: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882773.64008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.64025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882773.64031: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882773.64043: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882773.64050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.64058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.64073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.64081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.64088: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882773.64096: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.64185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.64192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.64197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.64334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.77101: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27844 1726882773.78100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882773.78148: stderr chunk (state=3): >>><<< 27844 1726882773.78152: stdout chunk (state=3): >>><<< 27844 1726882773.78170: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882773.78189: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882773.78195: _low_level_execute_command(): starting 27844 1726882773.78201: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882773.5238671-29319-120831419061808/ > /dev/null 2>&1 && sleep 0' 27844 1726882773.78635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.78638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.78678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.78682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882773.78685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.78732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.78735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.78834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.80639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.80680: stderr chunk (state=3): >>><<< 27844 1726882773.80684: stdout chunk (state=3): >>><<< 27844 1726882773.80698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882773.80703: handler run complete 27844 1726882773.80719: attempt loop complete, returning result 27844 1726882773.80722: _execute() done 27844 1726882773.80724: dumping result to json 27844 1726882773.80726: done dumping result, returning 27844 1726882773.80734: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-efa9-466a-000000000656] 27844 1726882773.80738: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000656 27844 1726882773.80830: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000656 27844 1726882773.80833: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 27844 1726882773.80894: no more pending results, returning what we have 27844 1726882773.80898: results queue empty 27844 1726882773.80898: checking for any_errors_fatal 27844 1726882773.80906: done checking for any_errors_fatal 27844 1726882773.80907: checking for max_fail_percentage 27844 1726882773.80908: done checking for max_fail_percentage 27844 1726882773.80909: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.80910: done checking to see if all hosts have failed 27844 1726882773.80910: getting the remaining hosts for this loop 27844 1726882773.80912: done getting the remaining hosts for this loop 27844 1726882773.80917: getting the next task for host managed_node1 27844 1726882773.80925: done getting next task for host managed_node1 27844 1726882773.80927: ^ task is: TASK: meta (role_complete) 27844 1726882773.80931: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882773.80943: getting variables 27844 1726882773.80945: in VariableManager get_vars() 27844 1726882773.80991: Calling all_inventory to load vars for managed_node1 27844 1726882773.80993: Calling groups_inventory to load vars for managed_node1 27844 1726882773.80996: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.81006: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.81009: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.81011: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.81845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.82780: done with get_vars() 27844 1726882773.82795: done getting variables 27844 1726882773.82853: done queuing things up, now waiting for results queue to drain 27844 1726882773.82854: results queue empty 27844 1726882773.82855: checking for any_errors_fatal 27844 1726882773.82857: done checking for any_errors_fatal 27844 1726882773.82857: checking for max_fail_percentage 27844 1726882773.82858: done checking for max_fail_percentage 27844 1726882773.82858: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.82859: done checking to see if all hosts have failed 27844 1726882773.82859: getting the 
remaining hosts for this loop 27844 1726882773.82860: done getting the remaining hosts for this loop 27844 1726882773.82862: getting the next task for host managed_node1 27844 1726882773.82867: done getting next task for host managed_node1 27844 1726882773.82869: ^ task is: TASK: Delete interface1 27844 1726882773.82870: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882773.82872: getting variables 27844 1726882773.82873: in VariableManager get_vars() 27844 1726882773.82882: Calling all_inventory to load vars for managed_node1 27844 1726882773.82887: Calling groups_inventory to load vars for managed_node1 27844 1726882773.82888: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.82891: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.82893: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.82894: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.83618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.84541: done with get_vars() 27844 1726882773.84554: done getting variables TASK [Delete interface1] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:151 Friday 20 September 2024 21:39:33 -0400 (0:00:00.372) 0:00:32.922 ****** 
27844 1726882773.84605: entering _queue_task() for managed_node1/include_tasks 27844 1726882773.84855: worker is 1 (out of 1 available) 27844 1726882773.84869: exiting _queue_task() for managed_node1/include_tasks 27844 1726882773.84883: done queuing things up, now waiting for results queue to drain 27844 1726882773.84885: waiting for pending results... 27844 1726882773.85069: running TaskExecutor() for managed_node1/TASK: Delete interface1 27844 1726882773.85147: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b5 27844 1726882773.85158: variable 'ansible_search_path' from source: unknown 27844 1726882773.85196: calling self._execute() 27844 1726882773.85273: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.85277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.85288: variable 'omit' from source: magic vars 27844 1726882773.85568: variable 'ansible_distribution_major_version' from source: facts 27844 1726882773.85580: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882773.85586: _execute() done 27844 1726882773.85589: dumping result to json 27844 1726882773.85592: done dumping result, returning 27844 1726882773.85598: done running TaskExecutor() for managed_node1/TASK: Delete interface1 [0e448fcc-3ce9-efa9-466a-0000000000b5] 27844 1726882773.85602: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b5 27844 1726882773.85702: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b5 27844 1726882773.85704: WORKER PROCESS EXITING 27844 1726882773.85734: no more pending results, returning what we have 27844 1726882773.85738: in VariableManager get_vars() 27844 1726882773.85788: Calling all_inventory to load vars for managed_node1 27844 1726882773.85791: Calling groups_inventory to load vars for managed_node1 27844 1726882773.85793: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.85802: Calling 
all_plugins_play to load vars for managed_node1 27844 1726882773.85805: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.85807: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.86561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.87593: done with get_vars() 27844 1726882773.87605: variable 'ansible_search_path' from source: unknown 27844 1726882773.87614: we have included files to process 27844 1726882773.87615: generating all_blocks data 27844 1726882773.87617: done generating all_blocks data 27844 1726882773.87621: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27844 1726882773.87622: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27844 1726882773.87625: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27844 1726882773.87777: done processing included file 27844 1726882773.87779: iterating over new_blocks loaded from include file 27844 1726882773.87780: in VariableManager get_vars() 27844 1726882773.87792: done with get_vars() 27844 1726882773.87793: filtering new block on tags 27844 1726882773.87811: done filtering new block on tags 27844 1726882773.87812: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 27844 1726882773.87816: extending task lists for all hosts with included blocks 27844 1726882773.88726: done extending task lists 27844 1726882773.88727: done processing included files 27844 1726882773.88728: results queue empty 27844 1726882773.88729: checking for any_errors_fatal 27844 1726882773.88731: 
done checking for any_errors_fatal 27844 1726882773.88731: checking for max_fail_percentage 27844 1726882773.88733: done checking for max_fail_percentage 27844 1726882773.88733: checking to see if all hosts have failed and the running result is not ok 27844 1726882773.88734: done checking to see if all hosts have failed 27844 1726882773.88735: getting the remaining hosts for this loop 27844 1726882773.88736: done getting the remaining hosts for this loop 27844 1726882773.88739: getting the next task for host managed_node1 27844 1726882773.88743: done getting next task for host managed_node1 27844 1726882773.88745: ^ task is: TASK: Remove test interface if necessary 27844 1726882773.88748: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882773.88750: getting variables 27844 1726882773.88751: in VariableManager get_vars() 27844 1726882773.88766: Calling all_inventory to load vars for managed_node1 27844 1726882773.88768: Calling groups_inventory to load vars for managed_node1 27844 1726882773.88770: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882773.88775: Calling all_plugins_play to load vars for managed_node1 27844 1726882773.88778: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882773.88781: Calling groups_plugins_play to load vars for managed_node1 27844 1726882773.89820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882773.90722: done with get_vars() 27844 1726882773.90736: done getting variables 27844 1726882773.90762: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:39:33 -0400 (0:00:00.061) 0:00:32.984 ****** 27844 1726882773.90786: entering _queue_task() for managed_node1/command 27844 1726882773.90978: worker is 1 (out of 1 available) 27844 1726882773.90991: exiting _queue_task() for managed_node1/command 27844 1726882773.91006: done queuing things up, now waiting for results queue to drain 27844 1726882773.91007: waiting for pending results... 
27844 1726882773.91181: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 27844 1726882773.91261: in run() - task 0e448fcc-3ce9-efa9-466a-000000000777 27844 1726882773.91277: variable 'ansible_search_path' from source: unknown 27844 1726882773.91280: variable 'ansible_search_path' from source: unknown 27844 1726882773.91307: calling self._execute() 27844 1726882773.91385: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.91389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.91397: variable 'omit' from source: magic vars 27844 1726882773.91851: variable 'ansible_distribution_major_version' from source: facts 27844 1726882773.91870: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882773.91882: variable 'omit' from source: magic vars 27844 1726882773.91926: variable 'omit' from source: magic vars 27844 1726882773.92022: variable 'interface' from source: set_fact 27844 1726882773.92045: variable 'omit' from source: magic vars 27844 1726882773.92092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882773.92131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882773.92155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882773.92182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.92197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882773.92227: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882773.92235: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.92242: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.92339: Set connection var ansible_shell_type to sh 27844 1726882773.92346: Set connection var ansible_connection to ssh 27844 1726882773.92355: Set connection var ansible_pipelining to False 27844 1726882773.92368: Set connection var ansible_timeout to 10 27844 1726882773.92378: Set connection var ansible_shell_executable to /bin/sh 27844 1726882773.92385: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882773.92410: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.92416: variable 'ansible_connection' from source: unknown 27844 1726882773.92421: variable 'ansible_module_compression' from source: unknown 27844 1726882773.92426: variable 'ansible_shell_type' from source: unknown 27844 1726882773.92430: variable 'ansible_shell_executable' from source: unknown 27844 1726882773.92435: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882773.92441: variable 'ansible_pipelining' from source: unknown 27844 1726882773.92446: variable 'ansible_timeout' from source: unknown 27844 1726882773.92452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882773.92583: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882773.92599: variable 'omit' from source: magic vars 27844 1726882773.92608: starting attempt loop 27844 1726882773.92614: running the handler 27844 1726882773.92634: _low_level_execute_command(): starting 27844 1726882773.92646: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882773.93347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 
1726882773.93369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.93386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.93404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.93443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.93455: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882773.93479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.93497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882773.93508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882773.93518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882773.93531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.93545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.93559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.93578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.93584: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882773.93595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.93668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.93689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.93700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.93822: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.95432: stdout chunk (state=3): >>>/root <<< 27844 1726882773.95584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.95625: stderr chunk (state=3): >>><<< 27844 1726882773.95629: stdout chunk (state=3): >>><<< 27844 1726882773.95673: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882773.95678: _low_level_execute_command(): starting 27844 1726882773.95751: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174 `" && echo ansible-tmp-1726882773.9565024-29336-152138201774174="` echo 
/root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174 `" ) && sleep 0' 27844 1726882773.96333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882773.96348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.96369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.96389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.96435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.96448: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882773.96462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.96486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882773.96499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882773.96512: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882773.96527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882773.96541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882773.96558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882773.96577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882773.96589: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882773.96604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882773.96685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882773.96702: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882773.96716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882773.96843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882773.98706: stdout chunk (state=3): >>>ansible-tmp-1726882773.9565024-29336-152138201774174=/root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174 <<< 27844 1726882773.98810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882773.98868: stderr chunk (state=3): >>><<< 27844 1726882773.98871: stdout chunk (state=3): >>><<< 27844 1726882773.98970: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882773.9565024-29336-152138201774174=/root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 27844 1726882773.98974: variable 'ansible_module_compression' from source: unknown 27844 1726882773.99069: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882773.99072: variable 'ansible_facts' from source: unknown 27844 1726882773.99100: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174/AnsiballZ_command.py 27844 1726882773.99244: Sending initial data 27844 1726882773.99247: Sent initial data (156 bytes) 27844 1726882774.00152: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882774.00168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.00183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.00200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.00238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.00249: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882774.00261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.00281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882774.00293: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882774.00302: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882774.00312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.00325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.00341: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.00352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.00362: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882774.00378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.00451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.00469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.00483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.00669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.02415: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882774.02501: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882774.02622: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmphxdakiv4 /root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174/AnsiballZ_command.py <<< 27844 1726882774.02713: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882774.04202: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 27844 1726882774.04336: stderr chunk (state=3): >>><<< 27844 1726882774.04339: stdout chunk (state=3): >>><<< 27844 1726882774.04430: done transferring module to remote 27844 1726882774.04439: _low_level_execute_command(): starting 27844 1726882774.04442: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174/ /root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174/AnsiballZ_command.py && sleep 0' 27844 1726882774.06508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882774.06526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.06542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.06560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.06604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.06682: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882774.06698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.06716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882774.06729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882774.06745: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882774.06759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.06776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.06793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.06806: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.06818: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882774.06832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.06922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.07090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.07106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.07228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.08988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882774.09040: stderr chunk (state=3): >>><<< 27844 1726882774.09044: stdout chunk (state=3): >>><<< 27844 1726882774.09133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882774.09137: _low_level_execute_command(): starting 27844 1726882774.09140: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174/AnsiballZ_command.py && sleep 0' 27844 1726882774.10723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882774.10785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.10800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.10819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.10976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.10990: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882774.11005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.11022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882774.11034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882774.11048: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882774.11060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.11080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.11097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.11110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 
1726882774.11121: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882774.11134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.11216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.11289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.11305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.11437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.26420: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-20 21:39:34.243729", "end": "2024-09-20 21:39:34.260971", "delta": "0:00:00.017242", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882774.27614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882774.27618: stdout chunk (state=3): >>><<< 27844 1726882774.27620: stderr chunk (state=3): >>><<< 27844 1726882774.27771: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-20 21:39:34.243729", "end": "2024-09-20 21:39:34.260971", "delta": "0:00:00.017242", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
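The module payload returned over SSH above is plain JSON. As a minimal sketch (field names copied from the log record above; this is not Ansible's own result-handling code), the fields that feed the final task summary can be pulled out like this:

```python
import json
from datetime import datetime

# Payload shape as emitted by ansible.legacy.command, copied from the log above
# (invocation details omitted for brevity).
payload = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "del", "ethtest1"],
 "start": "2024-09-20 21:39:34.243729", "end": "2024-09-20 21:39:34.260971",
 "delta": "0:00:00.017242", "msg": ""}'''

result = json.loads(payload)

# Recompute the elapsed time from start/end to confirm it matches "delta".
fmt = "%Y-%m-%d %H:%M:%S.%f"
elapsed = (datetime.strptime(result["end"], fmt)
           - datetime.strptime(result["start"], fmt))

print(result["rc"], " ".join(result["cmd"]), elapsed.total_seconds())
```

Note the module itself reports `"changed": true` here; the task summary later in the log shows `false` because of a conditional evaluated afterwards.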
27844 1726882774.27780: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882774.27784: _low_level_execute_command(): starting 27844 1726882774.27786: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882773.9565024-29336-152138201774174/ > /dev/null 2>&1 && sleep 0' 27844 1726882774.29429: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882774.29445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.29460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.29486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.29530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.29545: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882774.29559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.29583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882774.29685: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882774.29698: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882774.29711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.29727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.29745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.29758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.29775: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882774.29790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.29868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.29892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.29906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.30181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.31981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882774.32006: stderr chunk (state=3): >>><<< 27844 1726882774.32009: stdout chunk (state=3): >>><<< 27844 1726882774.32071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882774.32074: handler run complete 27844 1726882774.32076: Evaluated conditional (False): False 27844 1726882774.32078: attempt loop complete, returning result 27844 1726882774.32080: _execute() done 27844 1726882774.32081: dumping result to json 27844 1726882774.32172: done dumping result, returning 27844 1726882774.32175: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0e448fcc-3ce9-efa9-466a-000000000777] 27844 1726882774.32177: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000777 27844 1726882774.32258: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000777 27844 1726882774.32261: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest1" ], "delta": "0:00:00.017242", "end": "2024-09-20 21:39:34.260971", "rc": 0, "start": "2024-09-20 21:39:34.243729" } 27844 1726882774.32339: no more pending results, returning what we have 27844 1726882774.32343: results queue empty 27844 1726882774.32344: checking for any_errors_fatal 27844 1726882774.32346: done checking for any_errors_fatal 27844 1726882774.32346: checking for max_fail_percentage 27844 1726882774.32348: done checking for max_fail_percentage 27844 1726882774.32349: checking to see if all hosts have failed and the running 
result is not ok 27844 1726882774.32350: done checking to see if all hosts have failed 27844 1726882774.32350: getting the remaining hosts for this loop 27844 1726882774.32352: done getting the remaining hosts for this loop 27844 1726882774.32356: getting the next task for host managed_node1 27844 1726882774.32368: done getting next task for host managed_node1 27844 1726882774.32372: ^ task is: TASK: Assert interface1 is absent 27844 1726882774.32376: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882774.32382: getting variables 27844 1726882774.32387: in VariableManager get_vars() 27844 1726882774.32440: Calling all_inventory to load vars for managed_node1 27844 1726882774.32443: Calling groups_inventory to load vars for managed_node1 27844 1726882774.32446: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882774.32458: Calling all_plugins_play to load vars for managed_node1 27844 1726882774.32461: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882774.32572: Calling groups_plugins_play to load vars for managed_node1 27844 1726882774.35198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882774.36949: done with get_vars() 27844 1726882774.36980: done getting variables TASK [Assert interface1 is absent] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:153 Friday 20 September 2024 21:39:34 -0400 (0:00:00.463) 0:00:33.447 ****** 27844 1726882774.37094: entering _queue_task() for managed_node1/include_tasks 27844 1726882774.37426: worker is 1 (out of 1 available) 27844 1726882774.37441: exiting _queue_task() for managed_node1/include_tasks 27844 1726882774.37455: done queuing things up, now waiting for results queue to drain 27844 1726882774.37456: waiting for pending results... 
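A few records earlier, `Evaluated conditional (False): False` is the `changed_when` check for the `ip link del` task: the command module returned `"changed": true`, but the displayed result shows `"changed": false`. A toy sketch of that override step (illustrative names only, not Ansible's internal API):

```python
# Hypothetical sketch of a changed_when override: the conditional's boolean
# result replaces the module's own "changed" flag in the reported result.
# Illustrative only; not Ansible's actual TaskExecutor internals.
def apply_changed_when(module_result, conditional_result):
    final = dict(module_result)
    final["changed"] = bool(conditional_result)
    return final

module_result = {"changed": True, "rc": 0,
                 "cmd": ["ip", "link", "del", "ethtest1"]}
# The log records this step as "Evaluated conditional (False): False".
final = apply_changed_when(module_result, False)
print(final["changed"])  # False, as shown in the "ok:" summary
```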
27844 1726882774.38386: running TaskExecutor() for managed_node1/TASK: Assert interface1 is absent 27844 1726882774.38508: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b6 27844 1726882774.38529: variable 'ansible_search_path' from source: unknown 27844 1726882774.38577: calling self._execute() 27844 1726882774.38694: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.38704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.38721: variable 'omit' from source: magic vars 27844 1726882774.39140: variable 'ansible_distribution_major_version' from source: facts 27844 1726882774.39169: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882774.39182: _execute() done 27844 1726882774.39190: dumping result to json 27844 1726882774.39197: done dumping result, returning 27844 1726882774.39207: done running TaskExecutor() for managed_node1/TASK: Assert interface1 is absent [0e448fcc-3ce9-efa9-466a-0000000000b6] 27844 1726882774.39216: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b6 27844 1726882774.39334: no more pending results, returning what we have 27844 1726882774.39340: in VariableManager get_vars() 27844 1726882774.39392: Calling all_inventory to load vars for managed_node1 27844 1726882774.39395: Calling groups_inventory to load vars for managed_node1 27844 1726882774.39398: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882774.39413: Calling all_plugins_play to load vars for managed_node1 27844 1726882774.39416: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882774.39419: Calling groups_plugins_play to load vars for managed_node1 27844 1726882774.40583: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b6 27844 1726882774.40587: WORKER PROCESS EXITING 27844 1726882774.41149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 27844 1726882774.43606: done with get_vars() 27844 1726882774.43620: variable 'ansible_search_path' from source: unknown 27844 1726882774.43630: we have included files to process 27844 1726882774.43631: generating all_blocks data 27844 1726882774.43632: done generating all_blocks data 27844 1726882774.43637: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27844 1726882774.43638: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27844 1726882774.43640: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27844 1726882774.43751: in VariableManager get_vars() 27844 1726882774.43771: done with get_vars() 27844 1726882774.43849: done processing included file 27844 1726882774.43851: iterating over new_blocks loaded from include file 27844 1726882774.43852: in VariableManager get_vars() 27844 1726882774.43865: done with get_vars() 27844 1726882774.43868: filtering new block on tags 27844 1726882774.43892: done filtering new block on tags 27844 1726882774.43893: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 27844 1726882774.43897: extending task lists for all hosts with included blocks 27844 1726882774.45172: done extending task lists 27844 1726882774.45173: done processing included files 27844 1726882774.45174: results queue empty 27844 1726882774.45175: checking for any_errors_fatal 27844 1726882774.45182: done checking for any_errors_fatal 27844 1726882774.45183: checking for max_fail_percentage 27844 1726882774.45184: done checking for max_fail_percentage 27844 1726882774.45185: checking to see if all hosts have failed 
and the running result is not ok 27844 1726882774.45186: done checking to see if all hosts have failed 27844 1726882774.45186: getting the remaining hosts for this loop 27844 1726882774.45188: done getting the remaining hosts for this loop 27844 1726882774.45190: getting the next task for host managed_node1 27844 1726882774.45193: done getting next task for host managed_node1 27844 1726882774.45196: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27844 1726882774.45199: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882774.45202: getting variables 27844 1726882774.45203: in VariableManager get_vars() 27844 1726882774.45221: Calling all_inventory to load vars for managed_node1 27844 1726882774.45223: Calling groups_inventory to load vars for managed_node1 27844 1726882774.45225: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882774.45231: Calling all_plugins_play to load vars for managed_node1 27844 1726882774.45233: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882774.45235: Calling groups_plugins_play to load vars for managed_node1 27844 1726882774.46529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882774.48117: done with get_vars() 27844 1726882774.48139: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:39:34 -0400 (0:00:00.111) 0:00:33.558 ****** 27844 1726882774.48225: entering _queue_task() for managed_node1/include_tasks 27844 1726882774.48562: worker is 1 (out of 1 available) 27844 1726882774.48576: exiting _queue_task() for managed_node1/include_tasks 27844 1726882774.48589: done queuing things up, now waiting for results queue to drain 27844 1726882774.48590: waiting for pending results... 
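The nested `HOST STATE: block=..., task=...` printouts above are a cursor into nested task blocks, with each `always child state?` / `tasks child state?` holding a further state of the same shape. A toy model of that recursive record (field names modeled on the log's output; purely illustrative, not ansible's PlayIterator):

```python
from dataclasses import dataclass
from typing import Optional

# Toy model of the recursive "HOST STATE" records printed in the log.
@dataclass
class HostState:
    block: int = 0
    task: int = 0
    always_child: Optional["HostState"] = None
    tasks_child: Optional["HostState"] = None

    def depth(self) -> int:
        # Number of nested state levels currently active.
        child = self.always_child or self.tasks_child
        return 1 + child.depth() if child else 1

# Mirrors the shape of the state dump above: an always-child whose
# tasks-child has one further level of nesting.
state = HostState(block=3, task=11,
                  always_child=HostState(block=0, task=7,
                                         tasks_child=HostState(block=0, task=1)))
print(state.depth())  # 3
```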
27844 1726882774.48897: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 27844 1726882774.48998: in run() - task 0e448fcc-3ce9-efa9-466a-000000000816 27844 1726882774.49013: variable 'ansible_search_path' from source: unknown 27844 1726882774.49017: variable 'ansible_search_path' from source: unknown 27844 1726882774.49054: calling self._execute() 27844 1726882774.49155: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.49160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.49171: variable 'omit' from source: magic vars 27844 1726882774.49568: variable 'ansible_distribution_major_version' from source: facts 27844 1726882774.49589: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882774.49602: _execute() done 27844 1726882774.49612: dumping result to json 27844 1726882774.49622: done dumping result, returning 27844 1726882774.49630: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-efa9-466a-000000000816] 27844 1726882774.49638: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000816 27844 1726882774.49742: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000816 27844 1726882774.49750: WORKER PROCESS EXITING 27844 1726882774.49793: no more pending results, returning what we have 27844 1726882774.49799: in VariableManager get_vars() 27844 1726882774.49844: Calling all_inventory to load vars for managed_node1 27844 1726882774.49847: Calling groups_inventory to load vars for managed_node1 27844 1726882774.49849: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882774.49862: Calling all_plugins_play to load vars for managed_node1 27844 1726882774.49867: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882774.49870: Calling groups_plugins_play to load vars for managed_node1 27844 
1726882774.51552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882774.52949: done with get_vars() 27844 1726882774.52962: variable 'ansible_search_path' from source: unknown 27844 1726882774.52963: variable 'ansible_search_path' from source: unknown 27844 1726882774.52990: we have included files to process 27844 1726882774.52990: generating all_blocks data 27844 1726882774.52992: done generating all_blocks data 27844 1726882774.52992: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882774.52993: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882774.52994: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882774.53121: done processing included file 27844 1726882774.53123: iterating over new_blocks loaded from include file 27844 1726882774.53124: in VariableManager get_vars() 27844 1726882774.53136: done with get_vars() 27844 1726882774.53137: filtering new block on tags 27844 1726882774.53152: done filtering new block on tags 27844 1726882774.53154: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 27844 1726882774.53157: extending task lists for all hosts with included blocks 27844 1726882774.53231: done extending task lists 27844 1726882774.53232: done processing included files 27844 1726882774.53233: results queue empty 27844 1726882774.53233: checking for any_errors_fatal 27844 1726882774.53235: done checking for any_errors_fatal 27844 1726882774.53236: checking for max_fail_percentage 27844 1726882774.53236: done checking for 
max_fail_percentage 27844 1726882774.53237: checking to see if all hosts have failed and the running result is not ok 27844 1726882774.53238: done checking to see if all hosts have failed 27844 1726882774.53238: getting the remaining hosts for this loop 27844 1726882774.53239: done getting the remaining hosts for this loop 27844 1726882774.53240: getting the next task for host managed_node1 27844 1726882774.53243: done getting next task for host managed_node1 27844 1726882774.53244: ^ task is: TASK: Get stat for interface {{ interface }} 27844 1726882774.53247: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882774.53249: getting variables 27844 1726882774.53249: in VariableManager get_vars() 27844 1726882774.53258: Calling all_inventory to load vars for managed_node1 27844 1726882774.53260: Calling groups_inventory to load vars for managed_node1 27844 1726882774.53261: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882774.53267: Calling all_plugins_play to load vars for managed_node1 27844 1726882774.53269: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882774.53271: Calling groups_plugins_play to load vars for managed_node1 27844 1726882774.54385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882774.56023: done with get_vars() 27844 1726882774.56037: done getting variables 27844 1726882774.56180: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:39:34 -0400 (0:00:00.079) 0:00:33.638 ****** 27844 1726882774.56203: entering _queue_task() for managed_node1/stat 27844 1726882774.56435: worker is 1 (out of 1 available) 27844 1726882774.56448: exiting _queue_task() for managed_node1/stat 27844 1726882774.56460: done queuing things up, now waiting for results queue to drain 27844 1726882774.56462: waiting for pending results... 
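Just above, the task name `Get stat for interface {{ interface }}` is rendered to `Get stat for interface ethtest1` using the `interface` variable set via `set_fact`. A minimal stand-in for that substitution step (Ansible actually uses Jinja2 templating; this hand-rolled version is a sketch of the idea only):

```python
import re

# Toy {{ var }} substitution, standing in for the Jinja2 rendering step
# that produces the "Get stat for interface ethtest1" task name above.
def render(template: str, variables: dict) -> str:
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables[m.group(1)]),
                  template)

name = render("Get stat for interface {{ interface }}",
              {"interface": "ethtest1"})
print(name)  # Get stat for interface ethtest1
```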
27844 1726882774.56648: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest1 27844 1726882774.56727: in run() - task 0e448fcc-3ce9-efa9-466a-0000000008bc 27844 1726882774.56739: variable 'ansible_search_path' from source: unknown 27844 1726882774.56743: variable 'ansible_search_path' from source: unknown 27844 1726882774.56773: calling self._execute() 27844 1726882774.56853: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.56856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.56868: variable 'omit' from source: magic vars 27844 1726882774.57132: variable 'ansible_distribution_major_version' from source: facts 27844 1726882774.57142: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882774.57149: variable 'omit' from source: magic vars 27844 1726882774.57186: variable 'omit' from source: magic vars 27844 1726882774.57252: variable 'interface' from source: set_fact 27844 1726882774.57269: variable 'omit' from source: magic vars 27844 1726882774.57299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882774.57328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882774.57342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882774.57355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882774.57368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882774.57388: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882774.57391: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.57394: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.57462: Set connection var ansible_shell_type to sh 27844 1726882774.57469: Set connection var ansible_connection to ssh 27844 1726882774.57472: Set connection var ansible_pipelining to False 27844 1726882774.57477: Set connection var ansible_timeout to 10 27844 1726882774.57482: Set connection var ansible_shell_executable to /bin/sh 27844 1726882774.57487: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882774.57508: variable 'ansible_shell_executable' from source: unknown 27844 1726882774.57511: variable 'ansible_connection' from source: unknown 27844 1726882774.57514: variable 'ansible_module_compression' from source: unknown 27844 1726882774.57516: variable 'ansible_shell_type' from source: unknown 27844 1726882774.57519: variable 'ansible_shell_executable' from source: unknown 27844 1726882774.57521: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.57524: variable 'ansible_pipelining' from source: unknown 27844 1726882774.57526: variable 'ansible_timeout' from source: unknown 27844 1726882774.57529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.57692: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882774.57701: variable 'omit' from source: magic vars 27844 1726882774.57705: starting attempt loop 27844 1726882774.57708: running the handler 27844 1726882774.57720: _low_level_execute_command(): starting 27844 1726882774.57727: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882774.58413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882774.58424: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 27844 1726882774.58434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.58448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.58488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.58494: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882774.58504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.58516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882774.58524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882774.58531: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882774.58539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.58548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.58559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.58573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.58579: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882774.58591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.58660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.58682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.58703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.58829: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27844 1726882774.60476: stdout chunk (state=3): >>>/root <<< 27844 1726882774.60575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882774.60620: stderr chunk (state=3): >>><<< 27844 1726882774.60626: stdout chunk (state=3): >>><<< 27844 1726882774.60648: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882774.60657: _low_level_execute_command(): starting 27844 1726882774.60662: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694 `" && echo ansible-tmp-1726882774.6064603-29370-27720482018694="` echo /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694 `" ) && sleep 0' 27844 
1726882774.61055: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882774.61070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.61078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.61087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.61115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.61122: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882774.61130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.61140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882774.61145: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.61154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.61161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.61171: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882774.61174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.61224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.61245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.61252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.61343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 27844 1726882774.63194: stdout chunk (state=3): >>>ansible-tmp-1726882774.6064603-29370-27720482018694=/root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694 <<< 27844 1726882774.63300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882774.63348: stderr chunk (state=3): >>><<< 27844 1726882774.63352: stdout chunk (state=3): >>><<< 27844 1726882774.63368: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882774.6064603-29370-27720482018694=/root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882774.63398: variable 'ansible_module_compression' from source: unknown 27844 1726882774.63446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27844 1726882774.63479: 
variable 'ansible_facts' from source: unknown 27844 1726882774.63525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694/AnsiballZ_stat.py 27844 1726882774.63620: Sending initial data 27844 1726882774.63623: Sent initial data (152 bytes) 27844 1726882774.64249: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.64253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.64290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.64295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882774.64298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.64345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.64349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.64447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.66152: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 27844 1726882774.66156: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882774.66242: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882774.66336: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpg0b9j3kv /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694/AnsiballZ_stat.py <<< 27844 1726882774.66425: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882774.67401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882774.67485: stderr chunk (state=3): >>><<< 27844 1726882774.67488: stdout chunk (state=3): >>><<< 27844 1726882774.67502: done transferring module to remote 27844 1726882774.67510: _low_level_execute_command(): starting 27844 1726882774.67514: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694/ /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694/AnsiballZ_stat.py && sleep 0' 27844 1726882774.67911: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.67916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.67945: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.67960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.68018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.68024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.68128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.69836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882774.69886: stderr chunk (state=3): >>><<< 27844 1726882774.69892: stdout chunk (state=3): >>><<< 27844 1726882774.69908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882774.69911: _low_level_execute_command(): starting 27844 1726882774.69914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694/AnsiballZ_stat.py && sleep 0' 27844 1726882774.70342: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.70348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.70384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882774.70389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882774.70397: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882774.70403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.70408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.70419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found <<< 27844 1726882774.70429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.70481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.70493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.70599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.83684: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27844 1726882774.84706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882774.84710: stdout chunk (state=3): >>><<< 27844 1726882774.84712: stderr chunk (state=3): >>><<< 27844 1726882774.84769: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882774.84774: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882774.84856: _low_level_execute_command(): starting 27844 1726882774.84860: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882774.6064603-29370-27720482018694/ > /dev/null 2>&1 && sleep 0' 27844 1726882774.85440: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882774.85454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.85470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.85491: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.85538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.85550: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882774.85570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.85587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882774.85598: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882774.85610: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882774.85625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882774.85637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882774.85651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882774.85662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882774.85675: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882774.85687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882774.85770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882774.85792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882774.85806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882774.85924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882774.87710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882774.87779: stderr chunk (state=3): >>><<< 27844 1726882774.87790: stdout chunk 
(state=3): >>><<< 27844 1726882774.87973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882774.87977: handler run complete 27844 1726882774.87979: attempt loop complete, returning result 27844 1726882774.87981: _execute() done 27844 1726882774.87983: dumping result to json 27844 1726882774.87985: done dumping result, returning 27844 1726882774.87987: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest1 [0e448fcc-3ce9-efa9-466a-0000000008bc] 27844 1726882774.87989: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000008bc 27844 1726882774.88065: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000008bc 27844 1726882774.88069: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 27844 1726882774.88340: no more pending results, returning what 
we have 27844 1726882774.88344: results queue empty 27844 1726882774.88345: checking for any_errors_fatal 27844 1726882774.88346: done checking for any_errors_fatal 27844 1726882774.88347: checking for max_fail_percentage 27844 1726882774.88348: done checking for max_fail_percentage 27844 1726882774.88349: checking to see if all hosts have failed and the running result is not ok 27844 1726882774.88350: done checking to see if all hosts have failed 27844 1726882774.88351: getting the remaining hosts for this loop 27844 1726882774.88352: done getting the remaining hosts for this loop 27844 1726882774.88355: getting the next task for host managed_node1 27844 1726882774.88362: done getting next task for host managed_node1 27844 1726882774.88370: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 27844 1726882774.88375: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882774.88379: getting variables 27844 1726882774.88381: in VariableManager get_vars() 27844 1726882774.88419: Calling all_inventory to load vars for managed_node1 27844 1726882774.88421: Calling groups_inventory to load vars for managed_node1 27844 1726882774.88424: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882774.88434: Calling all_plugins_play to load vars for managed_node1 27844 1726882774.88437: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882774.88441: Calling groups_plugins_play to load vars for managed_node1 27844 1726882774.89877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882774.91638: done with get_vars() 27844 1726882774.91660: done getting variables 27844 1726882774.91725: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882774.91844: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest1'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:39:34 -0400 (0:00:00.356) 0:00:33.995 ****** 27844 1726882774.91881: entering _queue_task() for managed_node1/assert 27844 1726882774.92148: worker is 1 (out of 1 available) 27844 1726882774.92159: exiting _queue_task() for managed_node1/assert 27844 1726882774.92173: done queuing things up, now waiting for results queue to drain 27844 1726882774.92175: waiting for pending results... 
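The stat result above (`"stat": {"exists": false}`) amounts to a sysfs existence check: a kernel network interface appears as an entry under `/sys/class/net`, so stat-ing `/sys/class/net/ethtest1` reports it absent once the device is gone. A minimal standalone sketch of the same check (the helper name and `sysfs_root` parameter are illustrative, not part of Ansible's stat module):

```python
import os

def interface_exists(name, sysfs_root="/sys/class/net"):
    """Mirror of the stat-on-sysfs check from the log: a kernel network
    interface is visible as an entry under /sys/class/net, so a plain
    path-existence test is enough to decide presence."""
    return os.path.exists(os.path.join(sysfs_root, name))
```

Unlike the real task, this runs locally; in the log the check is shipped to the managed node as an AnsiballZ-packaged stat module and executed over the multiplexed SSH connection.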
27844 1726882774.92459: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest1' 27844 1726882774.92578: in run() - task 0e448fcc-3ce9-efa9-466a-000000000817 27844 1726882774.92599: variable 'ansible_search_path' from source: unknown 27844 1726882774.92606: variable 'ansible_search_path' from source: unknown 27844 1726882774.92648: calling self._execute() 27844 1726882774.92755: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.92768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.92782: variable 'omit' from source: magic vars 27844 1726882774.93145: variable 'ansible_distribution_major_version' from source: facts 27844 1726882774.93168: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882774.93181: variable 'omit' from source: magic vars 27844 1726882774.93230: variable 'omit' from source: magic vars 27844 1726882774.93335: variable 'interface' from source: set_fact 27844 1726882774.93365: variable 'omit' from source: magic vars 27844 1726882774.93411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882774.93452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882774.93487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882774.93509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882774.93526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882774.93559: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882774.93585: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.93595: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.93690: Set connection var ansible_shell_type to sh 27844 1726882774.93700: Set connection var ansible_connection to ssh 27844 1726882774.93709: Set connection var ansible_pipelining to False 27844 1726882774.93717: Set connection var ansible_timeout to 10 27844 1726882774.93724: Set connection var ansible_shell_executable to /bin/sh 27844 1726882774.93731: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882774.93756: variable 'ansible_shell_executable' from source: unknown 27844 1726882774.93767: variable 'ansible_connection' from source: unknown 27844 1726882774.93775: variable 'ansible_module_compression' from source: unknown 27844 1726882774.93784: variable 'ansible_shell_type' from source: unknown 27844 1726882774.93795: variable 'ansible_shell_executable' from source: unknown 27844 1726882774.93803: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882774.93813: variable 'ansible_pipelining' from source: unknown 27844 1726882774.93820: variable 'ansible_timeout' from source: unknown 27844 1726882774.93827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882774.93977: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882774.93996: variable 'omit' from source: magic vars 27844 1726882774.94011: starting attempt loop 27844 1726882774.94018: running the handler 27844 1726882774.94180: variable 'interface_stat' from source: set_fact 27844 1726882774.94196: Evaluated conditional (not interface_stat.stat.exists): True 27844 1726882774.94206: handler run complete 27844 1726882774.94230: attempt loop complete, returning result 
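The assert action evaluated here does no remote work; it only checks the condition `not interface_stat.stat.exists` against the result registered by the earlier stat task. A small sketch of that evaluation, using the result dict copied from the log (the function name is illustrative):

```python
# Registered result of the earlier "Get stat for interface ethtest1"
# task, copied from the log output above.
interface_stat = {"changed": False, "stat": {"exists": False}}

def interface_is_absent(result):
    # Same condition the assert action evaluates:
    # `not interface_stat.stat.exists`
    return not result["stat"]["exists"]
```

When the condition holds, the action returns `changed: false` with the "All assertions passed" message seen below; a present interface would make it fail the play instead.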
27844 1726882774.94238: _execute() done 27844 1726882774.94247: dumping result to json 27844 1726882774.94254: done dumping result, returning 27844 1726882774.94267: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest1' [0e448fcc-3ce9-efa9-466a-000000000817] 27844 1726882774.94278: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000817 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882774.94429: no more pending results, returning what we have 27844 1726882774.94433: results queue empty 27844 1726882774.94434: checking for any_errors_fatal 27844 1726882774.94445: done checking for any_errors_fatal 27844 1726882774.94446: checking for max_fail_percentage 27844 1726882774.94447: done checking for max_fail_percentage 27844 1726882774.94449: checking to see if all hosts have failed and the running result is not ok 27844 1726882774.94450: done checking to see if all hosts have failed 27844 1726882774.94450: getting the remaining hosts for this loop 27844 1726882774.94452: done getting the remaining hosts for this loop 27844 1726882774.94456: getting the next task for host managed_node1 27844 1726882774.94466: done getting next task for host managed_node1 27844 1726882774.94468: ^ task is: TASK: Set interface0 27844 1726882774.94472: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882774.94478: getting variables 27844 1726882774.94480: in VariableManager get_vars() 27844 1726882774.94527: Calling all_inventory to load vars for managed_node1 27844 1726882774.94530: Calling groups_inventory to load vars for managed_node1 27844 1726882774.94533: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882774.94546: Calling all_plugins_play to load vars for managed_node1 27844 1726882774.94550: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882774.94553: Calling groups_plugins_play to load vars for managed_node1 27844 1726882774.95603: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000817 27844 1726882774.95607: WORKER PROCESS EXITING 27844 1726882774.96458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882774.98172: done with get_vars() 27844 1726882774.98193: done getting variables 27844 1726882774.98256: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set interface0] ********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:155 Friday 20 September 2024 21:39:34 -0400 (0:00:00.064) 0:00:34.059 ****** 27844 1726882774.98289: entering _queue_task() for managed_node1/set_fact 27844 1726882774.98576: worker is 1 (out of 1 available) 27844 1726882774.98626: exiting _queue_task() for managed_node1/set_fact 27844 1726882774.98726: done queuing things up, now waiting for results queue to drain 27844 1726882774.98739: waiting for pending results... 
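The queued set_fact task ("Set interface0") stores `interface: ethtest0` as a host-level fact, which templated task names such as "Assert that the interface is absent - '{{ interface }}'" later render. A toy model of that fact merge, with purely illustrative names (this is not Ansible's VariableManager):

```python
# Toy model of set_fact: per-host facts live in a dict that later
# templated strings read from. Names here are illustrative only.
host_facts = {}

def set_fact(facts, **new_facts):
    """Merge new key/value pairs into the host's fact store."""
    facts.update(new_facts)
    return facts

set_fact(host_facts, interface="ethtest0")
task_name = "Assert that the interface is absent - '%s'" % host_facts["interface"]
```

In the real run the merged fact is what makes the same assert/stat task pair reusable across ethtest0 and ethtest1 in this test play.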
27844 1726882774.99175: running TaskExecutor() for managed_node1/TASK: Set interface0
27844 1726882774.99278: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b7
27844 1726882774.99301: variable 'ansible_search_path' from source: unknown
27844 1726882774.99341: calling self._execute()
27844 1726882774.99449: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882774.99459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882774.99476: variable 'omit' from source: magic vars
27844 1726882774.99853: variable 'ansible_distribution_major_version' from source: facts
27844 1726882774.99871: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882774.99882: variable 'omit' from source: magic vars
27844 1726882774.99925: variable 'omit' from source: magic vars
27844 1726882774.99958: variable 'interface0' from source: play vars
27844 1726882775.00033: variable 'interface0' from source: play vars
27844 1726882775.00056: variable 'omit' from source: magic vars
27844 1726882775.00101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
27844 1726882775.00146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
27844 1726882775.00174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
27844 1726882775.00195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882775.00209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882775.00240: variable 'inventory_hostname' from source: host vars for 'managed_node1'
27844 1726882775.00252: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882775.00259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882775.00403: Set connection var ansible_shell_type to sh
27844 1726882775.00410: Set connection var ansible_connection to ssh
27844 1726882775.00418: Set connection var ansible_pipelining to False
27844 1726882775.00428: Set connection var ansible_timeout to 10
27844 1726882775.00437: Set connection var ansible_shell_executable to /bin/sh
27844 1726882775.00446: Set connection var ansible_module_compression to ZIP_DEFLATED
27844 1726882775.00487: variable 'ansible_shell_executable' from source: unknown
27844 1726882775.00499: variable 'ansible_connection' from source: unknown
27844 1726882775.00505: variable 'ansible_module_compression' from source: unknown
27844 1726882775.00511: variable 'ansible_shell_type' from source: unknown
27844 1726882775.00516: variable 'ansible_shell_executable' from source: unknown
27844 1726882775.00521: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882775.00528: variable 'ansible_pipelining' from source: unknown
27844 1726882775.00533: variable 'ansible_timeout' from source: unknown
27844 1726882775.00539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882775.00682: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
27844 1726882775.00702: variable 'omit' from source: magic vars
27844 1726882775.00715: starting attempt loop
27844 1726882775.00722: running the handler
27844 1726882775.00736: handler run complete
27844 1726882775.00752: attempt loop complete, returning result
27844 1726882775.00758: _execute() done
27844 1726882775.00763: dumping result to json
27844 1726882775.00772: done dumping result, returning
27844 1726882775.00783: done running TaskExecutor() for managed_node1/TASK: Set interface0 [0e448fcc-3ce9-efa9-466a-0000000000b7]
27844 1726882775.00792: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b7
27844 1726882775.00886: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b7
27844 1726882775.00893: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "interface": "ethtest0"
    },
    "changed": false
}
27844 1726882775.00975: no more pending results, returning what we have
27844 1726882775.00979: results queue empty
27844 1726882775.00980: checking for any_errors_fatal
27844 1726882775.00989: done checking for any_errors_fatal
27844 1726882775.00989: checking for max_fail_percentage
27844 1726882775.00991: done checking for max_fail_percentage
27844 1726882775.00992: checking to see if all hosts have failed and the running result is not ok
27844 1726882775.00993: done checking to see if all hosts have failed
27844 1726882775.00994: getting the remaining hosts for this loop
27844 1726882775.00995: done getting the remaining hosts for this loop
27844 1726882775.00999: getting the next task for host managed_node1
27844 1726882775.01006: done getting next task for host managed_node1
27844 1726882775.01009: ^ task is: TASK: Delete interface0
27844 1726882775.01013: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
27844 1726882775.01016: getting variables
27844 1726882775.01018: in VariableManager get_vars()
27844 1726882775.01059: Calling all_inventory to load vars for managed_node1
27844 1726882775.01062: Calling groups_inventory to load vars for managed_node1
27844 1726882775.01067: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882775.01080: Calling all_plugins_play to load vars for managed_node1
27844 1726882775.01083: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882775.01086: Calling groups_plugins_play to load vars for managed_node1
27844 1726882775.02724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882775.04433: done with get_vars()
27844 1726882775.04455: done getting variables

TASK [Delete interface0] *******************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:158
Friday 20 September 2024 21:39:35 -0400 (0:00:00.062) 0:00:34.122 ******
27844 1726882775.04551: entering _queue_task() for managed_node1/include_tasks
27844 1726882775.04811: worker is 1 (out of 1 available)
27844 1726882775.04824: exiting _queue_task() for managed_node1/include_tasks
27844 1726882775.04836: done queuing things up, now waiting for results queue to drain
27844 1726882775.04838: waiting for pending results...
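The `ok: [managed_node1]` payload printed above is the default callback's rendering of the set_fact task's JSON result. As a local sanity-check sketch (not part of the run itself), the same payload can be parsed to confirm what the task registered:

```python
import json

# Result payload for the "Set interface0" set_fact task, exactly as the
# callback printed it above for managed_node1.
payload = """
{
    "ansible_facts": {
        "interface": "ethtest0"
    },
    "changed": false
}
"""

result = json.loads(payload)

# set_fact only registers host variables, so it never reports a change;
# every key under "ansible_facts" becomes a host var (here the "interface"
# fact that the later delete tasks consume as {{ interface }}).
assert result["changed"] is False
assert result["ansible_facts"]["interface"] == "ethtest0"
```

This is why the subsequent "Remove test interface if necessary" task logs `variable 'interface' from source: set_fact` rather than reading it from play vars.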
27844 1726882775.05117: running TaskExecutor() for managed_node1/TASK: Delete interface0
27844 1726882775.05225: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b8
27844 1726882775.05243: variable 'ansible_search_path' from source: unknown
27844 1726882775.05289: calling self._execute()
27844 1726882775.05385: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882775.05398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882775.05413: variable 'omit' from source: magic vars
27844 1726882775.05775: variable 'ansible_distribution_major_version' from source: facts
27844 1726882775.05794: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882775.05805: _execute() done
27844 1726882775.05813: dumping result to json
27844 1726882775.05822: done dumping result, returning
27844 1726882775.05835: done running TaskExecutor() for managed_node1/TASK: Delete interface0 [0e448fcc-3ce9-efa9-466a-0000000000b8]
27844 1726882775.05845: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b8
27844 1726882775.05961: no more pending results, returning what we have
27844 1726882775.05968: in VariableManager get_vars()
27844 1726882775.06014: Calling all_inventory to load vars for managed_node1
27844 1726882775.06017: Calling groups_inventory to load vars for managed_node1
27844 1726882775.06019: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882775.06033: Calling all_plugins_play to load vars for managed_node1
27844 1726882775.06036: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882775.06039: Calling groups_plugins_play to load vars for managed_node1
27844 1726882775.07082: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b8
27844 1726882775.07085: WORKER PROCESS EXITING
27844 1726882775.08267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882775.10707: done with get_vars()
27844 1726882775.10728: variable 'ansible_search_path' from source: unknown
27844 1726882775.10741: we have included files to process
27844 1726882775.10742: generating all_blocks data
27844 1726882775.10744: done generating all_blocks data
27844 1726882775.10748: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml
27844 1726882775.10749: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml
27844 1726882775.10752: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml
27844 1726882775.10928: done processing included file
27844 1726882775.10930: iterating over new_blocks loaded from include file
27844 1726882775.10931: in VariableManager get_vars()
27844 1726882775.10950: done with get_vars()
27844 1726882775.10952: filtering new block on tags
27844 1726882775.10978: done filtering new block on tags
27844 1726882775.10980: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1
27844 1726882775.10985: extending task lists for all hosts with included blocks
27844 1726882775.14634: done extending task lists
27844 1726882775.14635: done processing included files
27844 1726882775.14636: results queue empty
27844 1726882775.14637: checking for any_errors_fatal
27844 1726882775.14640: done checking for any_errors_fatal
27844 1726882775.14641: checking for max_fail_percentage
27844 1726882775.14642: done checking for max_fail_percentage
27844 1726882775.14643: checking to see if all hosts have failed and the running result is not ok
27844 1726882775.14644: done checking to see if all hosts have failed
27844 1726882775.14645: getting the remaining hosts for this loop
27844 1726882775.14646: done getting the remaining hosts for this loop
27844 1726882775.14649: getting the next task for host managed_node1
27844 1726882775.14653: done getting next task for host managed_node1
27844 1726882775.14655: ^ task is: TASK: Remove test interface if necessary
27844 1726882775.14658: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
27844 1726882775.14661: getting variables
27844 1726882775.14662: in VariableManager get_vars()
27844 1726882775.14682: Calling all_inventory to load vars for managed_node1
27844 1726882775.14685: Calling groups_inventory to load vars for managed_node1
27844 1726882775.14687: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882775.14693: Calling all_plugins_play to load vars for managed_node1
27844 1726882775.14696: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882775.14699: Calling groups_plugins_play to load vars for managed_node1
27844 1726882775.17446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882775.20496: done with get_vars()
27844 1726882775.20525: done getting variables
27844 1726882775.20572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Remove test interface if necessary] **************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3
Friday 20 September 2024 21:39:35 -0400 (0:00:00.160) 0:00:34.282 ******
27844 1726882775.20601: entering _queue_task() for managed_node1/command
27844 1726882775.20922: worker is 1 (out of 1 available)
27844 1726882775.20935: exiting _queue_task() for managed_node1/command
27844 1726882775.20946: done queuing things up, now waiting for results queue to drain
27844 1726882775.20948: waiting for pending results...
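The two numbers trailing each task banner are the previous task's elapsed time and the running playbook total. As a small arithmetic check against the two banners in this log: the total printed here (0:00:34.282) is the previous banner's total (0:00:34.122) plus the 0.160 s elapsed printed alongside it:

```python
# Task banner timing, taken from the two banners in the log above:
#   TASK [Delete interface0]                  ... (0:00:00.062)  0:00:34.122
#   TASK [Remove test interface if necessary] ... (0:00:00.160)  0:00:34.282
prev_total = 34.122  # running total at the "Delete interface0" banner, seconds
elapsed = 0.160      # elapsed time printed in the next banner, seconds

# The running total accumulates per-task elapsed times.
new_total = round(prev_total + elapsed, 3)
assert new_total == 34.282  # matches "0:00:34.282" in the banner
```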
27844 1726882775.21900: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary
27844 1726882775.22021: in run() - task 0e448fcc-3ce9-efa9-466a-0000000008da
27844 1726882775.22060: variable 'ansible_search_path' from source: unknown
27844 1726882775.22074: variable 'ansible_search_path' from source: unknown
27844 1726882775.22119: calling self._execute()
27844 1726882775.22224: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882775.22235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882775.22249: variable 'omit' from source: magic vars
27844 1726882775.22623: variable 'ansible_distribution_major_version' from source: facts
27844 1726882775.22642: Evaluated conditional (ansible_distribution_major_version != '6'): True
27844 1726882775.22656: variable 'omit' from source: magic vars
27844 1726882775.22702: variable 'omit' from source: magic vars
27844 1726882775.22789: variable 'interface' from source: set_fact
27844 1726882775.22814: variable 'omit' from source: magic vars
27844 1726882775.22860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
27844 1726882775.22899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
27844 1726882775.22920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
27844 1726882775.22944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882775.22958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
27844 1726882775.22993: variable 'inventory_hostname' from source: host vars for 'managed_node1'
27844 1726882775.23003: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882775.23010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882775.23112: Set connection var ansible_shell_type to sh
27844 1726882775.23119: Set connection var ansible_connection to ssh
27844 1726882775.23129: Set connection var ansible_pipelining to False
27844 1726882775.23138: Set connection var ansible_timeout to 10
27844 1726882775.23151: Set connection var ansible_shell_executable to /bin/sh
27844 1726882775.23159: Set connection var ansible_module_compression to ZIP_DEFLATED
27844 1726882775.23191: variable 'ansible_shell_executable' from source: unknown
27844 1726882775.23200: variable 'ansible_connection' from source: unknown
27844 1726882775.23206: variable 'ansible_module_compression' from source: unknown
27844 1726882775.23213: variable 'ansible_shell_type' from source: unknown
27844 1726882775.23220: variable 'ansible_shell_executable' from source: unknown
27844 1726882775.23226: variable 'ansible_host' from source: host vars for 'managed_node1'
27844 1726882775.23233: variable 'ansible_pipelining' from source: unknown
27844 1726882775.23239: variable 'ansible_timeout' from source: unknown
27844 1726882775.23247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
27844 1726882775.23393: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
27844 1726882775.23408: variable 'omit' from source: magic vars
27844 1726882775.23416: starting attempt loop
27844 1726882775.23422: running the handler
27844 1726882775.23439: _low_level_execute_command(): starting
27844 1726882775.23450: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
27844 1726882775.24612: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
27844 1726882775.24626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.24640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.24656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.24705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.24719: stderr chunk (state=3): >>>debug2: match not found <<<
27844 1726882775.24734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.24753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
27844 1726882775.24768: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
27844 1726882775.24781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
27844 1726882775.24797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.24810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.24825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.24839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.24852: stderr chunk (state=3): >>>debug2: match found <<<
27844 1726882775.24870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.24949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882775.24976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882775.24996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882775.25132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882775.26788: stdout chunk (state=3): >>>/root <<<
27844 1726882775.26953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882775.26956: stdout chunk (state=3): >>><<<
27844 1726882775.26958: stderr chunk (state=3): >>><<<
27844 1726882775.27066: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
27844 1726882775.27070: _low_level_execute_command(): starting
27844 1726882775.27073: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568 `" && echo ansible-tmp-1726882775.2697818-29396-22676086519568="` echo /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568 `" ) && sleep 0'
27844 1726882775.27623: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
27844 1726882775.27638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.27653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.27676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.27720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.27733: stderr chunk (state=3): >>>debug2: match not found <<<
27844 1726882775.27747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.27768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
27844 1726882775.27782: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
27844 1726882775.27793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
27844 1726882775.27805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.27819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.27839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.27852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.27865: stderr chunk (state=3): >>>debug2: match found <<<
27844 1726882775.27881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.27959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882775.27978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882775.27994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882775.28122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882775.29971: stdout chunk (state=3): >>>ansible-tmp-1726882775.2697818-29396-22676086519568=/root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568 <<<
27844 1726882775.30077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882775.30149: stderr chunk (state=3): >>><<<
27844 1726882775.30153: stdout chunk (state=3): >>><<<
27844 1726882775.30452: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882775.2697818-29396-22676086519568=/root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
27844 1726882775.30456: variable 'ansible_module_compression' from source: unknown
27844 1726882775.30458: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
27844 1726882775.30460: variable 'ansible_facts' from source: unknown
27844 1726882775.30462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568/AnsiballZ_command.py
27844 1726882775.30920: Sending initial data
27844 1726882775.30923: Sent initial data (155 bytes)
27844 1726882775.31851: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.31855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.31892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.31895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.31898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.31962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882775.31984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882775.32105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882775.33823: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
27844 1726882775.33922: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
27844 1726882775.34014: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpvhymvut2 /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568/AnsiballZ_command.py <<<
27844 1726882775.34106: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
27844 1726882775.35363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882775.35438: stderr chunk (state=3): >>><<<
27844 1726882775.35442: stdout chunk (state=3): >>><<<
27844 1726882775.35462: done transferring module to remote
27844 1726882775.35479: _low_level_execute_command(): starting
27844 1726882775.35484: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568/ /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568/AnsiballZ_command.py && sleep 0'
27844 1726882775.36087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
27844 1726882775.36097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.36106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.36119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.36154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.36161: stderr chunk (state=3): >>>debug2: match not found <<<
27844 1726882775.36176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.36188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
27844 1726882775.36195: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
27844 1726882775.36201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
27844 1726882775.36210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.36218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.36229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.36236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.36243: stderr chunk (state=3): >>>debug2: match found <<<
27844 1726882775.36251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.36326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882775.36339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882775.36350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882775.36467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882775.38230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882775.38253: stderr chunk (state=3): >>><<<
27844 1726882775.38256: stdout chunk (state=3): >>><<<
27844 1726882775.38344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
27844 1726882775.38348: _low_level_execute_command(): starting
27844 1726882775.38350: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568/AnsiballZ_command.py && sleep 0'
27844 1726882775.38899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
27844 1726882775.38914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.38929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.38948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.38991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.39003: stderr chunk (state=3): >>>debug2: match not found <<<
27844 1726882775.39018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.39035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
27844 1726882775.39047: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
27844 1726882775.39059: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
27844 1726882775.39076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882775.39091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882775.39106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882775.39117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
27844 1726882775.39128: stderr chunk (state=3): >>>debug2: match found <<<
27844 1726882775.39142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882775.39219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882775.39236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882775.39251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882775.39511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882775.53089: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:39:35.521806", "end": "2024-09-20 21:39:35.529354", "delta": "0:00:00.007548", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
27844 1726882775.54484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<<
27844 1726882775.54559: stderr chunk (state=3): >>><<<
27844 1726882775.54562: stdout chunk (state=3): >>><<<
27844 1726882775.54670: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:39:35.521806", "end": "2024-09-20 21:39:35.529354", "delta": "0:00:00.007548", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.44.90 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882775.54679: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882775.54682: _low_level_execute_command(): starting 27844 1726882775.54684: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882775.2697818-29396-22676086519568/ > /dev/null 2>&1 && sleep 0' 27844 1726882775.55917: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882775.56581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882775.56597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882775.56616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882775.56661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 
1726882775.56679: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882775.56695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882775.56714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882775.56727: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882775.56738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882775.56750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882775.56768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882775.56785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882775.56797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882775.56809: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882775.56823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882775.56901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882775.56925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882775.56943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882775.57074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882775.58973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882775.58976: stdout chunk (state=3): >>><<< 27844 1726882775.58978: stderr chunk (state=3): >>><<< 27844 1726882775.59272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882775.59276: handler run complete 27844 1726882775.59278: Evaluated conditional (False): False 27844 1726882775.59280: attempt loop complete, returning result 27844 1726882775.59282: _execute() done 27844 1726882775.59284: dumping result to json 27844 1726882775.59286: done dumping result, returning 27844 1726882775.59288: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0e448fcc-3ce9-efa9-466a-0000000008da] 27844 1726882775.59290: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000008da 27844 1726882775.59362: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000008da 27844 1726882775.59367: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.007548", "end": "2024-09-20 21:39:35.529354", "rc": 0, "start": "2024-09-20 21:39:35.521806" } 27844 1726882775.59424: no more pending results, returning what we have 
27844 1726882775.59428: results queue empty 27844 1726882775.59429: checking for any_errors_fatal 27844 1726882775.59430: done checking for any_errors_fatal 27844 1726882775.59431: checking for max_fail_percentage 27844 1726882775.59432: done checking for max_fail_percentage 27844 1726882775.59433: checking to see if all hosts have failed and the running result is not ok 27844 1726882775.59434: done checking to see if all hosts have failed 27844 1726882775.59435: getting the remaining hosts for this loop 27844 1726882775.59436: done getting the remaining hosts for this loop 27844 1726882775.59439: getting the next task for host managed_node1 27844 1726882775.59445: done getting next task for host managed_node1 27844 1726882775.59448: ^ task is: TASK: Assert interface0 is absent 27844 1726882775.59458: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882775.59462: getting variables 27844 1726882775.59465: in VariableManager get_vars() 27844 1726882775.59508: Calling all_inventory to load vars for managed_node1 27844 1726882775.59511: Calling groups_inventory to load vars for managed_node1 27844 1726882775.59514: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882775.59524: Calling all_plugins_play to load vars for managed_node1 27844 1726882775.59527: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882775.59530: Calling groups_plugins_play to load vars for managed_node1 27844 1726882775.62130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882775.67052: done with get_vars() 27844 1726882775.67093: done getting variables TASK [Assert interface0 is absent] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:160 Friday 20 September 2024 21:39:35 -0400 (0:00:00.466) 0:00:34.748 ****** 27844 1726882775.67206: entering _queue_task() for managed_node1/include_tasks 27844 1726882775.67550: worker is 1 (out of 1 available) 27844 1726882775.67769: exiting _queue_task() for managed_node1/include_tasks 27844 1726882775.67784: done queuing things up, now waiting for results queue to drain 27844 1726882775.67786: waiting for pending results... 
27844 1726882775.68540: running TaskExecutor() for managed_node1/TASK: Assert interface0 is absent 27844 1726882775.68655: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000b9 27844 1726882775.68788: variable 'ansible_search_path' from source: unknown 27844 1726882775.68827: calling self._execute() 27844 1726882775.68973: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882775.69079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882775.69101: variable 'omit' from source: magic vars 27844 1726882775.69991: variable 'ansible_distribution_major_version' from source: facts 27844 1726882775.70011: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882775.70023: _execute() done 27844 1726882775.70032: dumping result to json 27844 1726882775.70040: done dumping result, returning 27844 1726882775.70052: done running TaskExecutor() for managed_node1/TASK: Assert interface0 is absent [0e448fcc-3ce9-efa9-466a-0000000000b9] 27844 1726882775.70062: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b9 27844 1726882775.70201: no more pending results, returning what we have 27844 1726882775.70207: in VariableManager get_vars() 27844 1726882775.70260: Calling all_inventory to load vars for managed_node1 27844 1726882775.70265: Calling groups_inventory to load vars for managed_node1 27844 1726882775.70270: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882775.70288: Calling all_plugins_play to load vars for managed_node1 27844 1726882775.70292: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882775.70297: Calling groups_plugins_play to load vars for managed_node1 27844 1726882775.70817: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000b9 27844 1726882775.70821: WORKER PROCESS EXITING 27844 1726882775.73186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 27844 1726882775.76700: done with get_vars() 27844 1726882775.76724: variable 'ansible_search_path' from source: unknown 27844 1726882775.76854: we have included files to process 27844 1726882775.76855: generating all_blocks data 27844 1726882775.76857: done generating all_blocks data 27844 1726882775.76860: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27844 1726882775.76861: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27844 1726882775.76868: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27844 1726882775.77098: in VariableManager get_vars() 27844 1726882775.77122: done with get_vars() 27844 1726882775.77318: done processing included file 27844 1726882775.77320: iterating over new_blocks loaded from include file 27844 1726882775.77322: in VariableManager get_vars() 27844 1726882775.77339: done with get_vars() 27844 1726882775.77341: filtering new block on tags 27844 1726882775.77379: done filtering new block on tags 27844 1726882775.77382: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 27844 1726882775.77513: extending task lists for all hosts with included blocks 27844 1726882775.81199: done extending task lists 27844 1726882775.81200: done processing included files 27844 1726882775.81201: results queue empty 27844 1726882775.81202: checking for any_errors_fatal 27844 1726882775.81207: done checking for any_errors_fatal 27844 1726882775.81207: checking for max_fail_percentage 27844 1726882775.81208: done checking for max_fail_percentage 27844 1726882775.81209: checking to see if all hosts have failed 
and the running result is not ok 27844 1726882775.81210: done checking to see if all hosts have failed 27844 1726882775.81211: getting the remaining hosts for this loop 27844 1726882775.81212: done getting the remaining hosts for this loop 27844 1726882775.81215: getting the next task for host managed_node1 27844 1726882775.81219: done getting next task for host managed_node1 27844 1726882775.81221: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27844 1726882775.81224: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882775.81226: getting variables 27844 1726882775.81227: in VariableManager get_vars() 27844 1726882775.81241: Calling all_inventory to load vars for managed_node1 27844 1726882775.81243: Calling groups_inventory to load vars for managed_node1 27844 1726882775.81245: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882775.81251: Calling all_plugins_play to load vars for managed_node1 27844 1726882775.81253: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882775.81256: Calling groups_plugins_play to load vars for managed_node1 27844 1726882775.84655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882775.94146: done with get_vars() 27844 1726882775.94175: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:39:35 -0400 (0:00:00.271) 0:00:35.020 ****** 27844 1726882775.94370: entering _queue_task() for managed_node1/include_tasks 27844 1726882775.94918: worker is 1 (out of 1 available) 27844 1726882775.94930: exiting _queue_task() for managed_node1/include_tasks 27844 1726882775.94943: done queuing things up, now waiting for results queue to drain 27844 1726882775.94945: waiting for pending results... 
27844 1726882775.95249: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 27844 1726882775.95355: in run() - task 0e448fcc-3ce9-efa9-466a-000000000990 27844 1726882775.95366: variable 'ansible_search_path' from source: unknown 27844 1726882775.95372: variable 'ansible_search_path' from source: unknown 27844 1726882775.95412: calling self._execute() 27844 1726882775.95520: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882775.95533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882775.95544: variable 'omit' from source: magic vars 27844 1726882775.95995: variable 'ansible_distribution_major_version' from source: facts 27844 1726882775.96008: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882775.96015: _execute() done 27844 1726882775.96018: dumping result to json 27844 1726882775.96021: done dumping result, returning 27844 1726882775.96027: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-efa9-466a-000000000990] 27844 1726882775.96033: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000990 27844 1726882775.96131: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000990 27844 1726882775.96133: WORKER PROCESS EXITING 27844 1726882775.96178: no more pending results, returning what we have 27844 1726882775.96184: in VariableManager get_vars() 27844 1726882775.96232: Calling all_inventory to load vars for managed_node1 27844 1726882775.96235: Calling groups_inventory to load vars for managed_node1 27844 1726882775.96238: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882775.96253: Calling all_plugins_play to load vars for managed_node1 27844 1726882775.96256: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882775.96259: Calling groups_plugins_play to load vars for managed_node1 27844 
1726882775.98262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.00233: done with get_vars() 27844 1726882776.00263: variable 'ansible_search_path' from source: unknown 27844 1726882776.00268: variable 'ansible_search_path' from source: unknown 27844 1726882776.00313: we have included files to process 27844 1726882776.00314: generating all_blocks data 27844 1726882776.00316: done generating all_blocks data 27844 1726882776.00317: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882776.00319: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882776.00321: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27844 1726882776.00545: done processing included file 27844 1726882776.00547: iterating over new_blocks loaded from include file 27844 1726882776.00549: in VariableManager get_vars() 27844 1726882776.00573: done with get_vars() 27844 1726882776.00575: filtering new block on tags 27844 1726882776.00603: done filtering new block on tags 27844 1726882776.00605: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 27844 1726882776.00611: extending task lists for all hosts with included blocks 27844 1726882776.00753: done extending task lists 27844 1726882776.00755: done processing included files 27844 1726882776.00756: results queue empty 27844 1726882776.00756: checking for any_errors_fatal 27844 1726882776.00759: done checking for any_errors_fatal 27844 1726882776.00760: checking for max_fail_percentage 27844 1726882776.00774: done checking for 
max_fail_percentage 27844 1726882776.00775: checking to see if all hosts have failed and the running result is not ok 27844 1726882776.00776: done checking to see if all hosts have failed 27844 1726882776.00777: getting the remaining hosts for this loop 27844 1726882776.00778: done getting the remaining hosts for this loop 27844 1726882776.00782: getting the next task for host managed_node1 27844 1726882776.00786: done getting next task for host managed_node1 27844 1726882776.00788: ^ task is: TASK: Get stat for interface {{ interface }} 27844 1726882776.00793: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882776.00796: getting variables 27844 1726882776.00797: in VariableManager get_vars() 27844 1726882776.00811: Calling all_inventory to load vars for managed_node1 27844 1726882776.00813: Calling groups_inventory to load vars for managed_node1 27844 1726882776.00815: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.00820: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.00822: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.00825: Calling groups_plugins_play to load vars for managed_node1 27844 1726882776.02416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.05077: done with get_vars() 27844 1726882776.05102: done getting variables 27844 1726882776.05261: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:39:36 -0400 (0:00:00.109) 0:00:35.129 ****** 27844 1726882776.05298: entering _queue_task() for managed_node1/stat 27844 1726882776.05624: worker is 1 (out of 1 available) 27844 1726882776.05636: exiting _queue_task() for managed_node1/stat 27844 1726882776.05648: done queuing things up, now waiting for results queue to drain 27844 1726882776.05650: waiting for pending results... 
27844 1726882776.05957: running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 27844 1726882776.06101: in run() - task 0e448fcc-3ce9-efa9-466a-000000000a4d 27844 1726882776.06120: variable 'ansible_search_path' from source: unknown 27844 1726882776.06126: variable 'ansible_search_path' from source: unknown 27844 1726882776.06169: calling self._execute() 27844 1726882776.06272: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.06284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.06296: variable 'omit' from source: magic vars 27844 1726882776.06666: variable 'ansible_distribution_major_version' from source: facts 27844 1726882776.06685: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882776.06697: variable 'omit' from source: magic vars 27844 1726882776.06758: variable 'omit' from source: magic vars 27844 1726882776.06855: variable 'interface' from source: set_fact 27844 1726882776.06878: variable 'omit' from source: magic vars 27844 1726882776.06921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882776.06963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882776.06989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882776.07010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.07024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.07053: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882776.07064: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.07076: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.07175: Set connection var ansible_shell_type to sh 27844 1726882776.07185: Set connection var ansible_connection to ssh 27844 1726882776.07195: Set connection var ansible_pipelining to False 27844 1726882776.07204: Set connection var ansible_timeout to 10 27844 1726882776.07213: Set connection var ansible_shell_executable to /bin/sh 27844 1726882776.07221: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882776.07248: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.07255: variable 'ansible_connection' from source: unknown 27844 1726882776.07261: variable 'ansible_module_compression' from source: unknown 27844 1726882776.07270: variable 'ansible_shell_type' from source: unknown 27844 1726882776.07277: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.07283: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.07293: variable 'ansible_pipelining' from source: unknown 27844 1726882776.07300: variable 'ansible_timeout' from source: unknown 27844 1726882776.07307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.07495: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882776.07515: variable 'omit' from source: magic vars 27844 1726882776.07524: starting attempt loop 27844 1726882776.07530: running the handler 27844 1726882776.07545: _low_level_execute_command(): starting 27844 1726882776.07556: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882776.09208: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882776.09247: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 27844 1726882776.09267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.09287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.09331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.09520: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882776.09536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.09555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882776.09570: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882776.09583: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882776.09596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.09610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.09629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.09644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.09658: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882776.09677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.09754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.09780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.09798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.09930: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27844 1726882776.11604: stdout chunk (state=3): >>>/root <<< 27844 1726882776.11768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.11772: stdout chunk (state=3): >>><<< 27844 1726882776.11790: stderr chunk (state=3): >>><<< 27844 1726882776.11894: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882776.11898: _low_level_execute_command(): starting 27844 1726882776.11901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344 `" && echo ansible-tmp-1726882776.1181018-29426-215645719431344="` echo /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344 `" ) && sleep 0' 27844 
1726882776.12540: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882776.12554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.12572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.12596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.12636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.12647: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882776.12659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.12679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882776.12694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882776.12704: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882776.12715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.12727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.12740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.12750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.12760: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882776.12775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.12853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.12875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.12890: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.13012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.14878: stdout chunk (state=3): >>>ansible-tmp-1726882776.1181018-29426-215645719431344=/root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344 <<< 27844 1726882776.15061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.15066: stdout chunk (state=3): >>><<< 27844 1726882776.15068: stderr chunk (state=3): >>><<< 27844 1726882776.15373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882776.1181018-29426-215645719431344=/root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882776.15377: variable 'ansible_module_compression' from source: unknown 
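The round trip above shows Ansible creating its unique remote working directory with a plain `sh -c '( umask 77 && mkdir -p ... )'` command, using the `ansible-tmp-<timestamp>-<pid>-<random>` naming visible in the log. A minimal Python sketch of that naming shape (hypothetical helper; the real logic lives in Ansible's shell plugin and is more involved):

```python
import random
import time

def remote_tmp_name(pid: int) -> str:
    # Mimics the ansible-tmp-<timestamp>-<pid>-<random> pattern seen in
    # the log (e.g. ansible-tmp-1726882776.1181018-29426-...). This is a
    # simplified illustration, not Ansible's actual implementation.
    return "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2**48))

name = remote_tmp_name(29426)
print(name.startswith("ansible-tmp-") and name.split("-")[3] == "29426")
```

The timestamp plus worker PID plus a large random suffix makes collisions between concurrent task workers effectively impossible, which is why each `_execute_module()` call in the trace gets its own directory.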
27844 1726882776.15379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27844 1726882776.15382: variable 'ansible_facts' from source: unknown 27844 1726882776.15384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344/AnsiballZ_stat.py 27844 1726882776.15469: Sending initial data 27844 1726882776.15484: Sent initial data (153 bytes) 27844 1726882776.16459: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882776.16483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.16498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.16515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.16554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.16567: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882776.16586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.16603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882776.16614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882776.16626: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882776.16637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.16650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.16666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.16684: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.16698: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882776.16711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.16791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.16816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.16831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.16948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.18681: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882776.18773: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882776.18866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp6d0v3xhh /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344/AnsiballZ_stat.py <<< 27844 1726882776.18953: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882776.20290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.20534: stderr chunk (state=3): >>><<< 27844 1726882776.20537: stdout chunk (state=3): 
>>><<< 27844 1726882776.20539: done transferring module to remote 27844 1726882776.20542: _low_level_execute_command(): starting 27844 1726882776.20547: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344/ /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344/AnsiballZ_stat.py && sleep 0' 27844 1726882776.21134: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882776.21147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.21159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.21187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.21235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.21246: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882776.21261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.21281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882776.21294: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882776.21315: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882776.21329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.21344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.21359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.21376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.21387: stderr 
chunk (state=3): >>>debug2: match found <<< 27844 1726882776.21402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.21488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.21506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.21523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.21656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.23518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.23521: stdout chunk (state=3): >>><<< 27844 1726882776.23524: stderr chunk (state=3): >>><<< 27844 1726882776.23620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 27844 1726882776.23624: _low_level_execute_command(): starting 27844 1726882776.23626: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344/AnsiballZ_stat.py && sleep 0' 27844 1726882776.24694: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882776.24702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.24713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.24726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.24762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.24774: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882776.24784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.24797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882776.24806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882776.24813: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882776.24818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.24827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.24838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.24845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.24851: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882776.24861: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.24934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.24947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.24957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.25294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.38351: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27844 1726882776.39357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882776.39361: stdout chunk (state=3): >>><<< 27844 1726882776.39368: stderr chunk (state=3): >>><<< 27844 1726882776.39388: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 27844 1726882776.39420: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882776.39427: _low_level_execute_command(): starting 27844 1726882776.39432: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882776.1181018-29426-215645719431344/ > /dev/null 2>&1 && sleep 0' 27844 1726882776.40050: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882776.40056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.40068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.40111: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.40220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.40223: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882776.40225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.40227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882776.40229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882776.40231: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882776.40233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.40235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.40236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.40238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.40240: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882776.40242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.40319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.40322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.40324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.40591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.42485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.42517: stderr chunk (state=3): >>><<< 27844 1726882776.42522: stdout chunk (state=3): >>><<< 
27844 1726882776.42538: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882776.42543: handler run complete 27844 1726882776.42568: attempt loop complete, returning result 27844 1726882776.42574: _execute() done 27844 1726882776.42576: dumping result to json 27844 1726882776.42579: done dumping result, returning 27844 1726882776.42589: done running TaskExecutor() for managed_node1/TASK: Get stat for interface ethtest0 [0e448fcc-3ce9-efa9-466a-000000000a4d] 27844 1726882776.42593: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a4d 27844 1726882776.42702: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a4d 27844 1726882776.42705: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 27844 1726882776.42767: no more pending results, returning what we have 27844 
1726882776.42771: results queue empty 27844 1726882776.42772: checking for any_errors_fatal 27844 1726882776.42773: done checking for any_errors_fatal 27844 1726882776.42774: checking for max_fail_percentage 27844 1726882776.42776: done checking for max_fail_percentage 27844 1726882776.42777: checking to see if all hosts have failed and the running result is not ok 27844 1726882776.42778: done checking to see if all hosts have failed 27844 1726882776.42778: getting the remaining hosts for this loop 27844 1726882776.42780: done getting the remaining hosts for this loop 27844 1726882776.42784: getting the next task for host managed_node1 27844 1726882776.42792: done getting next task for host managed_node1 27844 1726882776.42794: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 27844 1726882776.42799: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882776.42803: getting variables 27844 1726882776.42805: in VariableManager get_vars() 27844 1726882776.42850: Calling all_inventory to load vars for managed_node1 27844 1726882776.42853: Calling groups_inventory to load vars for managed_node1 27844 1726882776.42855: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.42868: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.42870: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.42873: Calling groups_plugins_play to load vars for managed_node1 27844 1726882776.44687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.46771: done with get_vars() 27844 1726882776.46786: done getting variables 27844 1726882776.46857: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882776.46953: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:39:36 -0400 (0:00:00.416) 0:00:35.546 ****** 27844 1726882776.46980: entering _queue_task() for managed_node1/assert 27844 1726882776.47211: worker is 1 (out of 1 available) 27844 1726882776.47226: exiting _queue_task() for managed_node1/assert 27844 1726882776.47238: done queuing things up, now waiting for results queue to drain 27844 1726882776.47240: waiting for pending results... 
27844 1726882776.47417: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' 27844 1726882776.47509: in run() - task 0e448fcc-3ce9-efa9-466a-000000000991 27844 1726882776.47519: variable 'ansible_search_path' from source: unknown 27844 1726882776.47523: variable 'ansible_search_path' from source: unknown 27844 1726882776.47553: calling self._execute() 27844 1726882776.47653: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.47657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.47667: variable 'omit' from source: magic vars 27844 1726882776.48188: variable 'ansible_distribution_major_version' from source: facts 27844 1726882776.48193: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882776.48196: variable 'omit' from source: magic vars 27844 1726882776.48199: variable 'omit' from source: magic vars 27844 1726882776.48201: variable 'interface' from source: set_fact 27844 1726882776.48476: variable 'omit' from source: magic vars 27844 1726882776.48479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882776.48482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882776.48485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882776.48487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.48490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.48493: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882776.48495: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.48497: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.48500: Set connection var ansible_shell_type to sh 27844 1726882776.48502: Set connection var ansible_connection to ssh 27844 1726882776.48504: Set connection var ansible_pipelining to False 27844 1726882776.48520: Set connection var ansible_timeout to 10 27844 1726882776.48523: Set connection var ansible_shell_executable to /bin/sh 27844 1726882776.48526: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882776.49540: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.49543: variable 'ansible_connection' from source: unknown 27844 1726882776.49545: variable 'ansible_module_compression' from source: unknown 27844 1726882776.49548: variable 'ansible_shell_type' from source: unknown 27844 1726882776.49549: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.49551: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.49553: variable 'ansible_pipelining' from source: unknown 27844 1726882776.49555: variable 'ansible_timeout' from source: unknown 27844 1726882776.49557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.49559: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882776.49561: variable 'omit' from source: magic vars 27844 1726882776.49565: starting attempt loop 27844 1726882776.49567: running the handler 27844 1726882776.49569: variable 'interface_stat' from source: set_fact 27844 1726882776.49571: Evaluated conditional (not interface_stat.stat.exists): True 27844 1726882776.49572: handler run complete 27844 1726882776.49574: attempt loop complete, returning result 
27844 1726882776.49575: _execute() done 27844 1726882776.49577: dumping result to json 27844 1726882776.49579: done dumping result, returning 27844 1726882776.49581: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'ethtest0' [0e448fcc-3ce9-efa9-466a-000000000991] 27844 1726882776.49583: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000991 27844 1726882776.49645: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000991 27844 1726882776.49649: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882776.49702: no more pending results, returning what we have 27844 1726882776.49705: results queue empty 27844 1726882776.49706: checking for any_errors_fatal 27844 1726882776.49711: done checking for any_errors_fatal 27844 1726882776.49712: checking for max_fail_percentage 27844 1726882776.49713: done checking for max_fail_percentage 27844 1726882776.49714: checking to see if all hosts have failed and the running result is not ok 27844 1726882776.49715: done checking to see if all hosts have failed 27844 1726882776.49716: getting the remaining hosts for this loop 27844 1726882776.49717: done getting the remaining hosts for this loop 27844 1726882776.49721: getting the next task for host managed_node1 27844 1726882776.49727: done getting next task for host managed_node1 27844 1726882776.49730: ^ task is: TASK: Assert interface0 profile and interface1 profile are absent 27844 1726882776.49733: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882776.49737: getting variables 27844 1726882776.49738: in VariableManager get_vars() 27844 1726882776.49784: Calling all_inventory to load vars for managed_node1 27844 1726882776.49788: Calling groups_inventory to load vars for managed_node1 27844 1726882776.49790: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.49799: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.49802: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.49805: Calling groups_plugins_play to load vars for managed_node1 27844 1726882776.51805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.53141: done with get_vars() 27844 1726882776.53156: done getting variables TASK [Assert interface0 profile and interface1 profile are absent] ************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:162 Friday 20 September 2024 21:39:36 -0400 (0:00:00.062) 0:00:35.609 ****** 27844 1726882776.53247: entering _queue_task() for managed_node1/include_tasks 27844 1726882776.53531: worker is 1 (out of 1 available) 27844 1726882776.53549: exiting _queue_task() for managed_node1/include_tasks 27844 1726882776.53561: done queuing things up, now waiting for results queue to drain 27844 1726882776.53563: waiting for pending results... 
27844 1726882776.53987: running TaskExecutor() for managed_node1/TASK: Assert interface0 profile and interface1 profile are absent 27844 1726882776.53993: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000ba 27844 1726882776.53997: variable 'ansible_search_path' from source: unknown 27844 1726882776.54049: variable 'interface0' from source: play vars 27844 1726882776.54255: variable 'interface0' from source: play vars 27844 1726882776.54267: variable 'interface1' from source: play vars 27844 1726882776.54428: variable 'interface1' from source: play vars 27844 1726882776.54431: variable 'omit' from source: magic vars 27844 1726882776.54741: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.54751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.54765: variable 'omit' from source: magic vars 27844 1726882776.55025: variable 'ansible_distribution_major_version' from source: facts 27844 1726882776.55043: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882776.55081: variable 'item' from source: unknown 27844 1726882776.55144: variable 'item' from source: unknown 27844 1726882776.55291: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.55296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.55299: variable 'omit' from source: magic vars 27844 1726882776.55437: variable 'ansible_distribution_major_version' from source: facts 27844 1726882776.55443: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882776.55481: variable 'item' from source: unknown 27844 1726882776.55546: variable 'item' from source: unknown 27844 1726882776.55620: dumping result to json 27844 1726882776.55624: done dumping result, returning 27844 1726882776.55626: done running TaskExecutor() for managed_node1/TASK: Assert interface0 profile and interface1 profile are absent 
[0e448fcc-3ce9-efa9-466a-0000000000ba] 27844 1726882776.55628: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000ba 27844 1726882776.55667: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000ba 27844 1726882776.55697: no more pending results, returning what we have 27844 1726882776.55703: in VariableManager get_vars() 27844 1726882776.55751: Calling all_inventory to load vars for managed_node1 27844 1726882776.55755: Calling groups_inventory to load vars for managed_node1 27844 1726882776.55757: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.55777: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.55780: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.55784: Calling groups_plugins_play to load vars for managed_node1 27844 1726882776.56303: WORKER PROCESS EXITING 27844 1726882776.57275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.58190: done with get_vars() 27844 1726882776.58203: variable 'ansible_search_path' from source: unknown 27844 1726882776.58213: variable 'ansible_search_path' from source: unknown 27844 1726882776.58218: we have included files to process 27844 1726882776.58219: generating all_blocks data 27844 1726882776.58220: done generating all_blocks data 27844 1726882776.58223: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27844 1726882776.58224: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27844 1726882776.58225: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27844 1726882776.58331: in VariableManager get_vars() 27844 1726882776.58346: done with get_vars() 27844 
1726882776.58460: done processing included file 27844 1726882776.58462: iterating over new_blocks loaded from include file 27844 1726882776.58468: in VariableManager get_vars() 27844 1726882776.58485: done with get_vars() 27844 1726882776.58487: filtering new block on tags 27844 1726882776.58533: done filtering new block on tags 27844 1726882776.58549: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 => (item=ethtest0) 27844 1726882776.58555: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27844 1726882776.58556: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27844 1726882776.58559: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27844 1726882776.58644: in VariableManager get_vars() 27844 1726882776.58668: done with get_vars() 27844 1726882776.58769: done processing included file 27844 1726882776.58772: iterating over new_blocks loaded from include file 27844 1726882776.58773: in VariableManager get_vars() 27844 1726882776.58793: done with get_vars() 27844 1726882776.58795: filtering new block on tags 27844 1726882776.58832: done filtering new block on tags 27844 1726882776.58834: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 => (item=ethtest1) 27844 1726882776.58837: extending task lists for all hosts with included blocks 27844 1726882776.60205: done extending task lists 27844 1726882776.60206: done processing included files 27844 1726882776.60207: results queue 
empty 27844 1726882776.60207: checking for any_errors_fatal 27844 1726882776.60210: done checking for any_errors_fatal 27844 1726882776.60210: checking for max_fail_percentage 27844 1726882776.60211: done checking for max_fail_percentage 27844 1726882776.60211: checking to see if all hosts have failed and the running result is not ok 27844 1726882776.60212: done checking to see if all hosts have failed 27844 1726882776.60212: getting the remaining hosts for this loop 27844 1726882776.60213: done getting the remaining hosts for this loop 27844 1726882776.60215: getting the next task for host managed_node1 27844 1726882776.60217: done getting next task for host managed_node1 27844 1726882776.60219: ^ task is: TASK: Include the task 'get_profile_stat.yml' 27844 1726882776.60221: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882776.60223: getting variables 27844 1726882776.60223: in VariableManager get_vars() 27844 1726882776.60233: Calling all_inventory to load vars for managed_node1 27844 1726882776.60235: Calling groups_inventory to load vars for managed_node1 27844 1726882776.60240: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.60244: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.60246: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.60247: Calling groups_plugins_play to load vars for managed_node1 27844 1726882776.60908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.61827: done with get_vars() 27844 1726882776.61841: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:39:36 -0400 (0:00:00.086) 0:00:35.695 ****** 27844 1726882776.61897: entering _queue_task() for managed_node1/include_tasks 27844 1726882776.62119: worker is 1 (out of 1 available) 27844 1726882776.62132: exiting _queue_task() for managed_node1/include_tasks 27844 1726882776.62144: done queuing things up, now waiting for results queue to drain 27844 1726882776.62146: waiting for pending results... 
27844 1726882776.62322: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 27844 1726882776.62393: in run() - task 0e448fcc-3ce9-efa9-466a-000000000a6c 27844 1726882776.62408: variable 'ansible_search_path' from source: unknown 27844 1726882776.62412: variable 'ansible_search_path' from source: unknown 27844 1726882776.62438: calling self._execute() 27844 1726882776.62515: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.62519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.62527: variable 'omit' from source: magic vars 27844 1726882776.62805: variable 'ansible_distribution_major_version' from source: facts 27844 1726882776.62815: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882776.62821: _execute() done 27844 1726882776.62826: dumping result to json 27844 1726882776.62828: done dumping result, returning 27844 1726882776.62831: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-efa9-466a-000000000a6c] 27844 1726882776.62839: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a6c 27844 1726882776.62922: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a6c 27844 1726882776.62925: WORKER PROCESS EXITING 27844 1726882776.62958: no more pending results, returning what we have 27844 1726882776.62965: in VariableManager get_vars() 27844 1726882776.63010: Calling all_inventory to load vars for managed_node1 27844 1726882776.63013: Calling groups_inventory to load vars for managed_node1 27844 1726882776.63015: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.63026: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.63029: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.63031: Calling groups_plugins_play to load vars for managed_node1 27844 
1726882776.63886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.64826: done with get_vars() 27844 1726882776.64839: variable 'ansible_search_path' from source: unknown 27844 1726882776.64840: variable 'ansible_search_path' from source: unknown 27844 1726882776.64863: we have included files to process 27844 1726882776.64865: generating all_blocks data 27844 1726882776.64869: done generating all_blocks data 27844 1726882776.64869: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27844 1726882776.64870: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27844 1726882776.64871: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27844 1726882776.65551: done processing included file 27844 1726882776.65552: iterating over new_blocks loaded from include file 27844 1726882776.65553: in VariableManager get_vars() 27844 1726882776.65570: done with get_vars() 27844 1726882776.65571: filtering new block on tags 27844 1726882776.65608: done filtering new block on tags 27844 1726882776.65636: in VariableManager get_vars() 27844 1726882776.65649: done with get_vars() 27844 1726882776.65650: filtering new block on tags 27844 1726882776.65687: done filtering new block on tags 27844 1726882776.65689: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 27844 1726882776.65692: extending task lists for all hosts with included blocks 27844 1726882776.65759: done extending task lists 27844 1726882776.65760: done processing included files 27844 1726882776.65760: results queue empty 27844 
1726882776.65761: checking for any_errors_fatal 27844 1726882776.65765: done checking for any_errors_fatal 27844 1726882776.65768: checking for max_fail_percentage 27844 1726882776.65769: done checking for max_fail_percentage 27844 1726882776.65769: checking to see if all hosts have failed and the running result is not ok 27844 1726882776.65770: done checking to see if all hosts have failed 27844 1726882776.65770: getting the remaining hosts for this loop 27844 1726882776.65771: done getting the remaining hosts for this loop 27844 1726882776.65773: getting the next task for host managed_node1 27844 1726882776.65776: done getting next task for host managed_node1 27844 1726882776.65777: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 27844 1726882776.65779: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882776.65781: getting variables 27844 1726882776.65781: in VariableManager get_vars() 27844 1726882776.65789: Calling all_inventory to load vars for managed_node1 27844 1726882776.65791: Calling groups_inventory to load vars for managed_node1 27844 1726882776.65792: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.65795: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.65797: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.65798: Calling groups_plugins_play to load vars for managed_node1 27844 1726882776.66478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.67387: done with get_vars() 27844 1726882776.67403: done getting variables 27844 1726882776.67427: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:39:36 -0400 (0:00:00.055) 0:00:35.751 ****** 27844 1726882776.67450: entering _queue_task() for managed_node1/set_fact 27844 1726882776.67654: worker is 1 (out of 1 available) 27844 1726882776.67671: exiting _queue_task() for managed_node1/set_fact 27844 1726882776.67686: done queuing things up, now waiting for results queue to drain 27844 1726882776.67687: waiting for pending results... 
27844 1726882776.67858: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 27844 1726882776.67933: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b3c 27844 1726882776.67946: variable 'ansible_search_path' from source: unknown 27844 1726882776.67949: variable 'ansible_search_path' from source: unknown 27844 1726882776.67980: calling self._execute() 27844 1726882776.68055: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.68058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.68072: variable 'omit' from source: magic vars 27844 1726882776.68327: variable 'ansible_distribution_major_version' from source: facts 27844 1726882776.68337: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882776.68344: variable 'omit' from source: magic vars 27844 1726882776.68380: variable 'omit' from source: magic vars 27844 1726882776.68403: variable 'omit' from source: magic vars 27844 1726882776.68433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882776.68459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882776.68479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882776.68494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.68504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.68527: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882776.68530: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.68534: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 27844 1726882776.68606: Set connection var ansible_shell_type to sh 27844 1726882776.68610: Set connection var ansible_connection to ssh 27844 1726882776.68612: Set connection var ansible_pipelining to False 27844 1726882776.68617: Set connection var ansible_timeout to 10 27844 1726882776.68623: Set connection var ansible_shell_executable to /bin/sh 27844 1726882776.68628: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882776.68648: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.68651: variable 'ansible_connection' from source: unknown 27844 1726882776.68655: variable 'ansible_module_compression' from source: unknown 27844 1726882776.68657: variable 'ansible_shell_type' from source: unknown 27844 1726882776.68659: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.68661: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.68665: variable 'ansible_pipelining' from source: unknown 27844 1726882776.68670: variable 'ansible_timeout' from source: unknown 27844 1726882776.68672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.68769: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882776.68776: variable 'omit' from source: magic vars 27844 1726882776.68782: starting attempt loop 27844 1726882776.68785: running the handler 27844 1726882776.68794: handler run complete 27844 1726882776.68803: attempt loop complete, returning result 27844 1726882776.68806: _execute() done 27844 1726882776.68810: dumping result to json 27844 1726882776.68812: done dumping result, returning 27844 1726882776.68817: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-efa9-466a-000000000b3c] 27844 1726882776.68822: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3c 27844 1726882776.68905: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3c 27844 1726882776.68908: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 27844 1726882776.68970: no more pending results, returning what we have 27844 1726882776.68973: results queue empty 27844 1726882776.68974: checking for any_errors_fatal 27844 1726882776.68976: done checking for any_errors_fatal 27844 1726882776.68976: checking for max_fail_percentage 27844 1726882776.68977: done checking for max_fail_percentage 27844 1726882776.68978: checking to see if all hosts have failed and the running result is not ok 27844 1726882776.68979: done checking to see if all hosts have failed 27844 1726882776.68980: getting the remaining hosts for this loop 27844 1726882776.68981: done getting the remaining hosts for this loop 27844 1726882776.68984: getting the next task for host managed_node1 27844 1726882776.68988: done getting next task for host managed_node1 27844 1726882776.68990: ^ task is: TASK: Stat profile file 27844 1726882776.68994: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882776.68997: getting variables 27844 1726882776.68999: in VariableManager get_vars() 27844 1726882776.69035: Calling all_inventory to load vars for managed_node1 27844 1726882776.69037: Calling groups_inventory to load vars for managed_node1 27844 1726882776.69039: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882776.69051: Calling all_plugins_play to load vars for managed_node1 27844 1726882776.69054: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882776.69057: Calling groups_plugins_play to load vars for managed_node1 27844 1726882776.69822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882776.70838: done with get_vars() 27844 1726882776.70852: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:39:36 -0400 (0:00:00.034) 0:00:35.785 ****** 27844 1726882776.70915: entering _queue_task() for managed_node1/stat 27844 1726882776.71102: worker is 1 (out of 1 available) 27844 1726882776.71116: exiting _queue_task() for 
managed_node1/stat 27844 1726882776.71129: done queuing things up, now waiting for results queue to drain 27844 1726882776.71131: waiting for pending results... 27844 1726882776.71295: running TaskExecutor() for managed_node1/TASK: Stat profile file 27844 1726882776.71370: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b3d 27844 1726882776.71379: variable 'ansible_search_path' from source: unknown 27844 1726882776.71382: variable 'ansible_search_path' from source: unknown 27844 1726882776.71412: calling self._execute() 27844 1726882776.71484: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.71488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.71495: variable 'omit' from source: magic vars 27844 1726882776.71751: variable 'ansible_distribution_major_version' from source: facts 27844 1726882776.71761: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882776.71771: variable 'omit' from source: magic vars 27844 1726882776.71804: variable 'omit' from source: magic vars 27844 1726882776.71873: variable 'profile' from source: include params 27844 1726882776.71878: variable 'item' from source: include params 27844 1726882776.71927: variable 'item' from source: include params 27844 1726882776.71941: variable 'omit' from source: magic vars 27844 1726882776.71975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882776.72002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882776.72017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882776.72031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.72040: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882776.72061: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882776.72068: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.72071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.72137: Set connection var ansible_shell_type to sh 27844 1726882776.72140: Set connection var ansible_connection to ssh 27844 1726882776.72145: Set connection var ansible_pipelining to False 27844 1726882776.72150: Set connection var ansible_timeout to 10 27844 1726882776.72155: Set connection var ansible_shell_executable to /bin/sh 27844 1726882776.72160: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882776.72185: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.72189: variable 'ansible_connection' from source: unknown 27844 1726882776.72193: variable 'ansible_module_compression' from source: unknown 27844 1726882776.72195: variable 'ansible_shell_type' from source: unknown 27844 1726882776.72197: variable 'ansible_shell_executable' from source: unknown 27844 1726882776.72199: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882776.72202: variable 'ansible_pipelining' from source: unknown 27844 1726882776.72204: variable 'ansible_timeout' from source: unknown 27844 1726882776.72206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882776.72344: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882776.72352: variable 'omit' from source: magic vars 27844 1726882776.72357: starting attempt loop 27844 1726882776.72360: running 
the handler 27844 1726882776.72374: _low_level_execute_command(): starting 27844 1726882776.72381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882776.72897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.72907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.72937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.72951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882776.72961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.73009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.73020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.73130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.74802: stdout chunk (state=3): >>>/root <<< 27844 1726882776.74911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.74958: stderr chunk (state=3): >>><<< 27844 1726882776.74962: stdout chunk (state=3): >>><<< 27844 1726882776.74985: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882776.74998: _low_level_execute_command(): starting 27844 1726882776.75002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563 `" && echo ansible-tmp-1726882776.7498345-29465-20589998004563="` echo /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563 `" ) && sleep 0' 27844 1726882776.75430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.75436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.75461: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882776.75488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.75537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.75546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.75648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.77527: stdout chunk (state=3): >>>ansible-tmp-1726882776.7498345-29465-20589998004563=/root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563 <<< 27844 1726882776.77639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.77685: stderr chunk (state=3): >>><<< 27844 1726882776.77689: stdout chunk (state=3): >>><<< 27844 1726882776.77707: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882776.7498345-29465-20589998004563=/root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882776.77738: variable 'ansible_module_compression' from source: unknown 27844 1726882776.77786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27844 1726882776.77820: variable 'ansible_facts' from source: unknown 27844 1726882776.77885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563/AnsiballZ_stat.py 27844 1726882776.77986: Sending initial data 27844 1726882776.77989: Sent initial data (152 bytes) 27844 1726882776.78613: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.78619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.78648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.78661: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.78677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.78722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.78733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.78831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.80547: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882776.80635: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882776.80729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpbue5hjeu /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563/AnsiballZ_stat.py <<< 27844 1726882776.80819: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882776.81836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.81926: stderr chunk (state=3): >>><<< 27844 1726882776.81929: stdout chunk (state=3): >>><<< 27844 1726882776.81944: done transferring module to remote 27844 1726882776.81952: _low_level_execute_command(): starting 27844 1726882776.81956: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563/ /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563/AnsiballZ_stat.py && sleep 0' 27844 1726882776.82365: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.82375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.82402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.82415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.82468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882776.82481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 27844 1726882776.82581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.84294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882776.84333: stderr chunk (state=3): >>><<< 27844 1726882776.84336: stdout chunk (state=3): >>><<< 27844 1726882776.84347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882776.84350: _low_level_execute_command(): starting 27844 1726882776.84354: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563/AnsiballZ_stat.py && sleep 0' 27844 1726882776.84754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 27844 1726882776.84759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.84805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.84808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.84810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.84870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882776.84873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882776.84975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882776.97894: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27844 1726882776.98890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882776.98997: stderr chunk (state=3): >>><<< 27844 1726882776.99000: stdout chunk (state=3): >>><<< 27844 1726882776.99145: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882776.99150: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882776.99152: _low_level_execute_command(): starting 27844 1726882776.99155: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882776.7498345-29465-20589998004563/ > /dev/null 2>&1 && sleep 0' 27844 1726882776.99752: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882776.99771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.99791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882776.99813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882776.99865: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882776.99884: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882776.99902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882776.99924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 
1726882776.99944: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882776.99958: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882776.99978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882776.99995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882777.00013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.00026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882777.00042: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882777.00056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.00128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882777.00168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882777.00188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882777.00314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882777.02184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882777.02237: stderr chunk (state=3): >>><<< 27844 1726882777.02240: stdout chunk (state=3): >>><<< 27844 1726882777.02279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882777.02283: handler run complete 27844 1726882777.02377: attempt loop complete, returning result 27844 1726882777.02380: _execute() done 27844 1726882777.02382: dumping result to json 27844 1726882777.02384: done dumping result, returning 27844 1726882777.02387: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-efa9-466a-000000000b3d] 27844 1726882777.02389: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3d 27844 1726882777.02557: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3d 27844 1726882777.02560: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 27844 1726882777.02654: no more pending results, returning what we have 27844 1726882777.02658: results queue empty 27844 1726882777.02659: checking for any_errors_fatal 27844 1726882777.02675: done checking for any_errors_fatal 27844 1726882777.02677: checking for max_fail_percentage 27844 1726882777.02679: done checking for max_fail_percentage 27844 1726882777.02680: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.02681: done checking to see if all hosts have failed 27844 1726882777.02681: getting the remaining hosts for this loop 27844 
1726882777.02683: done getting the remaining hosts for this loop 27844 1726882777.02687: getting the next task for host managed_node1 27844 1726882777.02696: done getting next task for host managed_node1 27844 1726882777.02699: ^ task is: TASK: Set NM profile exist flag based on the profile files 27844 1726882777.02705: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882777.02710: getting variables 27844 1726882777.02712: in VariableManager get_vars() 27844 1726882777.02761: Calling all_inventory to load vars for managed_node1 27844 1726882777.02773: Calling groups_inventory to load vars for managed_node1 27844 1726882777.02777: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.02791: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.02794: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.02797: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.05034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.07652: done with get_vars() 27844 1726882777.07679: done getting variables 27844 1726882777.07737: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:39:37 -0400 (0:00:00.368) 0:00:36.154 ****** 27844 1726882777.07778: entering _queue_task() for managed_node1/set_fact 27844 1726882777.08121: worker is 1 (out of 1 available) 27844 1726882777.08137: exiting _queue_task() for managed_node1/set_fact 27844 1726882777.08152: done queuing things up, now waiting for results queue to drain 27844 1726882777.08153: waiting for pending results... 
27844 1726882777.08488: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 27844 1726882777.10079: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b3e 27844 1726882777.10097: variable 'ansible_search_path' from source: unknown 27844 1726882777.10104: variable 'ansible_search_path' from source: unknown 27844 1726882777.10145: calling self._execute() 27844 1726882777.10262: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.10390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.10403: variable 'omit' from source: magic vars 27844 1726882777.11115: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.11262: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.11508: variable 'profile_stat' from source: set_fact 27844 1726882777.11523: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882777.11529: when evaluation is False, skipping this task 27844 1726882777.11536: _execute() done 27844 1726882777.11543: dumping result to json 27844 1726882777.11550: done dumping result, returning 27844 1726882777.11559: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-efa9-466a-000000000b3e] 27844 1726882777.11575: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3e 27844 1726882777.11786: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3e skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882777.11835: no more pending results, returning what we have 27844 1726882777.11841: results queue empty 27844 1726882777.11842: checking for any_errors_fatal 27844 1726882777.11850: done checking for any_errors_fatal 27844 1726882777.11851: checking for max_fail_percentage 27844 
1726882777.11853: done checking for max_fail_percentage 27844 1726882777.11854: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.11855: done checking to see if all hosts have failed 27844 1726882777.11856: getting the remaining hosts for this loop 27844 1726882777.11857: done getting the remaining hosts for this loop 27844 1726882777.11861: getting the next task for host managed_node1 27844 1726882777.11873: done getting next task for host managed_node1 27844 1726882777.11877: ^ task is: TASK: Get NM profile info 27844 1726882777.11882: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882777.11887: getting variables 27844 1726882777.11889: in VariableManager get_vars() 27844 1726882777.11935: Calling all_inventory to load vars for managed_node1 27844 1726882777.11938: Calling groups_inventory to load vars for managed_node1 27844 1726882777.11941: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.11955: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.11958: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.11961: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.12986: WORKER PROCESS EXITING 27844 1726882777.14520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.18545: done with get_vars() 27844 1726882777.18577: done getting variables 27844 1726882777.19377: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:39:37 -0400 (0:00:00.116) 0:00:36.270 ****** 27844 1726882777.19412: entering _queue_task() for managed_node1/shell 27844 1726882777.19414: Creating lock for shell 27844 1726882777.19771: worker is 1 (out of 1 available) 27844 1726882777.19785: exiting _queue_task() for managed_node1/shell 27844 1726882777.19798: done queuing things up, now waiting for results queue to drain 27844 1726882777.19800: waiting for pending results... 
27844 1726882777.20658: running TaskExecutor() for managed_node1/TASK: Get NM profile info 27844 1726882777.20880: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b3f 27844 1726882777.20894: variable 'ansible_search_path' from source: unknown 27844 1726882777.20898: variable 'ansible_search_path' from source: unknown 27844 1726882777.21057: calling self._execute() 27844 1726882777.21268: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.21272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.21280: variable 'omit' from source: magic vars 27844 1726882777.21979: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.21992: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.21999: variable 'omit' from source: magic vars 27844 1726882777.22169: variable 'omit' from source: magic vars 27844 1726882777.22377: variable 'profile' from source: include params 27844 1726882777.22380: variable 'item' from source: include params 27844 1726882777.22568: variable 'item' from source: include params 27844 1726882777.22584: variable 'omit' from source: magic vars 27844 1726882777.22624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882777.22772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882777.22793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882777.22810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882777.22821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882777.22983: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 
1726882777.22986: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.22991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.23219: Set connection var ansible_shell_type to sh 27844 1726882777.23223: Set connection var ansible_connection to ssh 27844 1726882777.23228: Set connection var ansible_pipelining to False 27844 1726882777.23234: Set connection var ansible_timeout to 10 27844 1726882777.23239: Set connection var ansible_shell_executable to /bin/sh 27844 1726882777.23244: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882777.23273: variable 'ansible_shell_executable' from source: unknown 27844 1726882777.23276: variable 'ansible_connection' from source: unknown 27844 1726882777.23279: variable 'ansible_module_compression' from source: unknown 27844 1726882777.23281: variable 'ansible_shell_type' from source: unknown 27844 1726882777.23283: variable 'ansible_shell_executable' from source: unknown 27844 1726882777.23285: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.23288: variable 'ansible_pipelining' from source: unknown 27844 1726882777.23291: variable 'ansible_timeout' from source: unknown 27844 1726882777.23295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.23654: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882777.23671: variable 'omit' from source: magic vars 27844 1726882777.23674: starting attempt loop 27844 1726882777.23677: running the handler 27844 1726882777.23686: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882777.23703: _low_level_execute_command(): starting 27844 1726882777.23712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882777.25838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882777.25847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.26017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882777.26025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27844 1726882777.26100: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.26107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882777.26121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.26268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882777.26425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882777.26429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882777.26643: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27844 1726882777.28208: stdout chunk (state=3): >>>/root <<< 27844 1726882777.28374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882777.28378: stderr chunk (state=3): >>><<< 27844 1726882777.28383: stdout chunk (state=3): >>><<< 27844 1726882777.28409: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882777.28420: _low_level_execute_command(): starting 27844 1726882777.28427: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076 `" && echo ansible-tmp-1726882777.2840726-29480-276146050283076="` echo /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076 `" ) && sleep 0' 27844 
1726882777.29852: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.29858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.29972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.29978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882777.29983: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882777.29996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.30001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882777.30008: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882777.30022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.30114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882777.30274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882777.30492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882777.32256: stdout chunk (state=3): >>>ansible-tmp-1726882777.2840726-29480-276146050283076=/root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076 <<< 27844 1726882777.32384: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 27844 1726882777.32470: stderr chunk (state=3): >>><<< 27844 1726882777.32474: stdout chunk (state=3): >>><<< 27844 1726882777.32776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882777.2840726-29480-276146050283076=/root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882777.32780: variable 'ansible_module_compression' from source: unknown 27844 1726882777.32782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882777.32785: variable 'ansible_facts' from source: unknown 27844 1726882777.32787: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076/AnsiballZ_command.py 27844 1726882777.32849: Sending initial data 27844 1726882777.32855: 
Sent initial data (156 bytes) 27844 1726882777.34495: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882777.34510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.34531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882777.34548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.34591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882777.34602: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882777.34614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.34636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882777.34648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882777.34658: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882777.34674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.34689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882777.34703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.34714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882777.34724: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882777.34736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.34811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882777.34834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
27844 1726882777.34854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882777.34989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882777.36716: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882777.36802: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882777.36904: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp8ese8w8a /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076/AnsiballZ_command.py <<< 27844 1726882777.36997: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882777.38774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882777.38778: stderr chunk (state=3): >>><<< 27844 1726882777.38780: stdout chunk (state=3): >>><<< 27844 1726882777.38782: done transferring module to remote 27844 1726882777.38785: _low_level_execute_command(): starting 27844 1726882777.38787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076/ /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076/AnsiballZ_command.py && sleep 0' 27844 1726882777.39393: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882777.39410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.39430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882777.39448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.39494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882777.39513: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882777.39527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.39544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882777.39555: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882777.39570: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882777.39583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.39596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882777.39612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.39627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882777.39637: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882777.39649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.39723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882777.39741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882777.39755: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 27844 1726882777.39881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882777.41602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882777.41674: stderr chunk (state=3): >>><<< 27844 1726882777.41691: stdout chunk (state=3): >>><<< 27844 1726882777.41792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882777.41798: _low_level_execute_command(): starting 27844 1726882777.41801: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076/AnsiballZ_command.py && sleep 0' 27844 1726882777.42497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 
1726882777.42501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.42545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882777.42549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.42555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882777.42557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.42944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882777.42958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882777.42968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882777.43125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882777.57917: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:39:37.560520", "end": "2024-09-20 21:39:37.577607", "delta": "0:00:00.017087", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882777.59021: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. <<< 27844 1726882777.59078: stderr chunk (state=3): >>><<< 27844 1726882777.59082: stdout chunk (state=3): >>><<< 27844 1726882777.59100: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:39:37.560520", "end": "2024-09-20 21:39:37.577607", "delta": "0:00:00.017087", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. 27844 1726882777.59129: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882777.59136: _low_level_execute_command(): starting 27844 1726882777.59141: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882777.2840726-29480-276146050283076/ > /dev/null 2>&1 && sleep 0' 27844 1726882777.59596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882777.59600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882777.59646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.59650: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882777.59652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882777.59704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882777.59710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882777.59720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882777.59824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882777.61604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882777.61650: stderr chunk (state=3): >>><<< 27844 1726882777.61653: stdout chunk (state=3): >>><<< 27844 1726882777.61667: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882777.61676: handler run complete 27844 1726882777.61694: Evaluated conditional (False): False 27844 1726882777.61702: attempt loop complete, returning result 27844 1726882777.61705: _execute() done 27844 1726882777.61707: dumping result to json 27844 1726882777.61711: done dumping result, returning 27844 1726882777.61720: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-efa9-466a-000000000b3f] 27844 1726882777.61724: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3f 27844 1726882777.61818: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b3f 27844 1726882777.61822: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017087", "end": "2024-09-20 21:39:37.577607", "rc": 1, "start": "2024-09-20 21:39:37.560520" } MSG: non-zero return code ...ignoring 27844 1726882777.61894: no more pending results, returning what we have 27844 1726882777.61898: results queue empty 27844 1726882777.61899: checking for any_errors_fatal 27844 1726882777.61905: done checking for any_errors_fatal 27844 1726882777.61906: checking for max_fail_percentage 27844 1726882777.61907: done checking for max_fail_percentage 27844 1726882777.61908: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.61909: done checking to see if all hosts have failed 27844 1726882777.61910: getting the remaining hosts for this loop 27844 1726882777.61911: done getting the remaining hosts for this loop 27844 1726882777.61915: getting the next task for host managed_node1 27844 1726882777.61921: done getting next task for host managed_node1 27844 1726882777.61924: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27844 1726882777.61928: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882777.61932: getting variables 27844 1726882777.61933: in VariableManager get_vars() 27844 1726882777.61978: Calling all_inventory to load vars for managed_node1 27844 1726882777.61981: Calling groups_inventory to load vars for managed_node1 27844 1726882777.61983: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.61994: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.61997: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.61999: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.63196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.64117: done with get_vars() 27844 1726882777.64134: done getting variables 27844 1726882777.64179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:39:37 -0400 (0:00:00.447) 0:00:36.718 ****** 27844 1726882777.64207: entering _queue_task() for managed_node1/set_fact 27844 1726882777.64417: worker is 1 (out of 1 available) 27844 1726882777.64431: exiting _queue_task() for 
managed_node1/set_fact 27844 1726882777.64446: done queuing things up, now waiting for results queue to drain 27844 1726882777.64448: waiting for pending results... 27844 1726882777.64627: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27844 1726882777.64705: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b40 27844 1726882777.64717: variable 'ansible_search_path' from source: unknown 27844 1726882777.64721: variable 'ansible_search_path' from source: unknown 27844 1726882777.64752: calling self._execute() 27844 1726882777.64831: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.64835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.64844: variable 'omit' from source: magic vars 27844 1726882777.65117: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.65128: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.65220: variable 'nm_profile_exists' from source: set_fact 27844 1726882777.65229: Evaluated conditional (nm_profile_exists.rc == 0): False 27844 1726882777.65232: when evaluation is False, skipping this task 27844 1726882777.65235: _execute() done 27844 1726882777.65237: dumping result to json 27844 1726882777.65240: done dumping result, returning 27844 1726882777.65247: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-efa9-466a-000000000b40] 27844 1726882777.65252: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b40 27844 1726882777.65337: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b40 27844 1726882777.65341: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 27844 
1726882777.65390: no more pending results, returning what we have 27844 1726882777.65394: results queue empty 27844 1726882777.65395: checking for any_errors_fatal 27844 1726882777.65401: done checking for any_errors_fatal 27844 1726882777.65402: checking for max_fail_percentage 27844 1726882777.65403: done checking for max_fail_percentage 27844 1726882777.65404: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.65405: done checking to see if all hosts have failed 27844 1726882777.65405: getting the remaining hosts for this loop 27844 1726882777.65406: done getting the remaining hosts for this loop 27844 1726882777.65409: getting the next task for host managed_node1 27844 1726882777.65416: done getting next task for host managed_node1 27844 1726882777.65419: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 27844 1726882777.65423: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 27844 1726882777.65427: getting variables 27844 1726882777.65428: in VariableManager get_vars() 27844 1726882777.65467: Calling all_inventory to load vars for managed_node1 27844 1726882777.65470: Calling groups_inventory to load vars for managed_node1 27844 1726882777.65472: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.65482: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.65484: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.65487: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.66341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.67253: done with get_vars() 27844 1726882777.67270: done getting variables 27844 1726882777.67311: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882777.67400: variable 'profile' from source: include params 27844 1726882777.67404: variable 'item' from source: include params 27844 1726882777.67446: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:39:37 -0400 (0:00:00.032) 0:00:36.751 ****** 27844 1726882777.67472: entering _queue_task() for managed_node1/command 27844 1726882777.67685: worker is 1 (out of 1 available) 27844 1726882777.67699: exiting _queue_task() for managed_node1/command 27844 1726882777.67713: done queuing things up, now waiting for results queue to drain 27844 1726882777.67715: 
waiting for pending results... 27844 1726882777.67899: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 27844 1726882777.67977: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b42 27844 1726882777.67988: variable 'ansible_search_path' from source: unknown 27844 1726882777.67992: variable 'ansible_search_path' from source: unknown 27844 1726882777.68024: calling self._execute() 27844 1726882777.68104: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.68108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.68116: variable 'omit' from source: magic vars 27844 1726882777.68373: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.68386: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.68470: variable 'profile_stat' from source: set_fact 27844 1726882777.68478: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882777.68481: when evaluation is False, skipping this task 27844 1726882777.68483: _execute() done 27844 1726882777.68488: dumping result to json 27844 1726882777.68490: done dumping result, returning 27844 1726882777.68493: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0e448fcc-3ce9-efa9-466a-000000000b42] 27844 1726882777.68500: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b42 27844 1726882777.68582: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b42 27844 1726882777.68585: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882777.68653: no more pending results, returning what we have 27844 1726882777.68656: results queue empty 27844 1726882777.68657: checking for any_errors_fatal 27844 1726882777.68662: done checking for 
any_errors_fatal 27844 1726882777.68663: checking for max_fail_percentage 27844 1726882777.68668: done checking for max_fail_percentage 27844 1726882777.68669: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.68670: done checking to see if all hosts have failed 27844 1726882777.68670: getting the remaining hosts for this loop 27844 1726882777.68672: done getting the remaining hosts for this loop 27844 1726882777.68675: getting the next task for host managed_node1 27844 1726882777.68680: done getting next task for host managed_node1 27844 1726882777.68682: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 27844 1726882777.68687: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882777.68691: getting variables 27844 1726882777.68692: in VariableManager get_vars() 27844 1726882777.68730: Calling all_inventory to load vars for managed_node1 27844 1726882777.68732: Calling groups_inventory to load vars for managed_node1 27844 1726882777.68735: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.68742: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.68744: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.68746: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.69874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.71361: done with get_vars() 27844 1726882777.71379: done getting variables 27844 1726882777.71419: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882777.71493: variable 'profile' from source: include params 27844 1726882777.71496: variable 'item' from source: include params 27844 1726882777.71532: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:39:37 -0400 (0:00:00.040) 0:00:36.792 ****** 27844 1726882777.71556: entering _queue_task() for managed_node1/set_fact 27844 1726882777.71746: worker is 1 (out of 1 available) 27844 1726882777.71760: exiting _queue_task() for managed_node1/set_fact 27844 1726882777.71777: done queuing things up, now waiting for results queue to drain 27844 1726882777.71779: waiting for pending results... 
27844 1726882777.71939: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 27844 1726882777.72009: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b43 27844 1726882777.72023: variable 'ansible_search_path' from source: unknown 27844 1726882777.72026: variable 'ansible_search_path' from source: unknown 27844 1726882777.72052: calling self._execute() 27844 1726882777.72130: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.72133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.72141: variable 'omit' from source: magic vars 27844 1726882777.72385: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.72395: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.72479: variable 'profile_stat' from source: set_fact 27844 1726882777.72489: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882777.72492: when evaluation is False, skipping this task 27844 1726882777.72495: _execute() done 27844 1726882777.72497: dumping result to json 27844 1726882777.72499: done dumping result, returning 27844 1726882777.72505: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0e448fcc-3ce9-efa9-466a-000000000b43] 27844 1726882777.72510: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b43 27844 1726882777.72597: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b43 27844 1726882777.72599: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882777.72647: no more pending results, returning what we have 27844 1726882777.72650: results queue empty 27844 1726882777.72651: checking for any_errors_fatal 27844 1726882777.72656: done checking for any_errors_fatal 27844 1726882777.72657: 
checking for max_fail_percentage 27844 1726882777.72658: done checking for max_fail_percentage 27844 1726882777.72659: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.72659: done checking to see if all hosts have failed 27844 1726882777.72660: getting the remaining hosts for this loop 27844 1726882777.72661: done getting the remaining hosts for this loop 27844 1726882777.72669: getting the next task for host managed_node1 27844 1726882777.72676: done getting next task for host managed_node1 27844 1726882777.72679: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 27844 1726882777.72683: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882777.72686: getting variables 27844 1726882777.72688: in VariableManager get_vars() 27844 1726882777.72720: Calling all_inventory to load vars for managed_node1 27844 1726882777.72722: Calling groups_inventory to load vars for managed_node1 27844 1726882777.72724: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.72731: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.72732: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.72734: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.73482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.74425: done with get_vars() 27844 1726882777.74438: done getting variables 27844 1726882777.74483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882777.74551: variable 'profile' from source: include params 27844 1726882777.74554: variable 'item' from source: include params 27844 1726882777.74595: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:39:37 -0400 (0:00:00.030) 0:00:36.822 ****** 27844 1726882777.74617: entering _queue_task() for managed_node1/command 27844 1726882777.74799: worker is 1 (out of 1 available) 27844 1726882777.74814: exiting _queue_task() for managed_node1/command 27844 1726882777.74826: done queuing things up, now waiting for results queue to drain 27844 1726882777.74828: waiting for pending results... 
27844 1726882777.74986: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 27844 1726882777.75052: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b44 27844 1726882777.75065: variable 'ansible_search_path' from source: unknown 27844 1726882777.75072: variable 'ansible_search_path' from source: unknown 27844 1726882777.75098: calling self._execute() 27844 1726882777.75174: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.75179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.75188: variable 'omit' from source: magic vars 27844 1726882777.75431: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.75442: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.75554: variable 'profile_stat' from source: set_fact 27844 1726882777.75574: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882777.75581: when evaluation is False, skipping this task 27844 1726882777.75587: _execute() done 27844 1726882777.75593: dumping result to json 27844 1726882777.75601: done dumping result, returning 27844 1726882777.75612: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0e448fcc-3ce9-efa9-466a-000000000b44] 27844 1726882777.75620: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b44 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882777.75769: no more pending results, returning what we have 27844 1726882777.75773: results queue empty 27844 1726882777.75775: checking for any_errors_fatal 27844 1726882777.75784: done checking for any_errors_fatal 27844 1726882777.75785: checking for max_fail_percentage 27844 1726882777.75786: done checking for max_fail_percentage 27844 1726882777.75787: checking to see if all hosts have 
failed and the running result is not ok 27844 1726882777.75788: done checking to see if all hosts have failed 27844 1726882777.75789: getting the remaining hosts for this loop 27844 1726882777.75791: done getting the remaining hosts for this loop 27844 1726882777.75794: getting the next task for host managed_node1 27844 1726882777.75800: done getting next task for host managed_node1 27844 1726882777.75804: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 27844 1726882777.75809: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882777.75814: getting variables 27844 1726882777.75815: in VariableManager get_vars() 27844 1726882777.75979: Calling all_inventory to load vars for managed_node1 27844 1726882777.75983: Calling groups_inventory to load vars for managed_node1 27844 1726882777.75986: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.75993: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b44 27844 1726882777.75996: WORKER PROCESS EXITING 27844 1726882777.76008: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.76012: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.76014: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.77287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.78194: done with get_vars() 27844 1726882777.78209: done getting variables 27844 1726882777.78248: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882777.78322: variable 'profile' from source: include params 27844 1726882777.78325: variable 'item' from source: include params 27844 1726882777.78362: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:39:37 -0400 (0:00:00.037) 0:00:36.860 ****** 27844 1726882777.78387: entering _queue_task() for managed_node1/set_fact 27844 1726882777.78571: worker is 1 (out of 1 available) 27844 1726882777.78587: exiting _queue_task() for managed_node1/set_fact 27844 
1726882777.78600: done queuing things up, now waiting for results queue to drain 27844 1726882777.78601: waiting for pending results... 27844 1726882777.79187: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 27844 1726882777.79447: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b45 27844 1726882777.79474: variable 'ansible_search_path' from source: unknown 27844 1726882777.79485: variable 'ansible_search_path' from source: unknown 27844 1726882777.79525: calling self._execute() 27844 1726882777.79628: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.79640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.79653: variable 'omit' from source: magic vars 27844 1726882777.80015: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.80032: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.80160: variable 'profile_stat' from source: set_fact 27844 1726882777.80187: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882777.80196: when evaluation is False, skipping this task 27844 1726882777.80204: _execute() done 27844 1726882777.80211: dumping result to json 27844 1726882777.80219: done dumping result, returning 27844 1726882777.80227: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0e448fcc-3ce9-efa9-466a-000000000b45] 27844 1726882777.80237: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b45 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882777.80387: no more pending results, returning what we have 27844 1726882777.80392: results queue empty 27844 1726882777.80393: checking for any_errors_fatal 27844 1726882777.80400: done checking for any_errors_fatal 27844 1726882777.80401: checking for 
max_fail_percentage 27844 1726882777.80403: done checking for max_fail_percentage 27844 1726882777.80404: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.80405: done checking to see if all hosts have failed 27844 1726882777.80405: getting the remaining hosts for this loop 27844 1726882777.80407: done getting the remaining hosts for this loop 27844 1726882777.80411: getting the next task for host managed_node1 27844 1726882777.80419: done getting next task for host managed_node1 27844 1726882777.80422: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 27844 1726882777.80427: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882777.80433: getting variables 27844 1726882777.80434: in VariableManager get_vars() 27844 1726882777.80484: Calling all_inventory to load vars for managed_node1 27844 1726882777.80487: Calling groups_inventory to load vars for managed_node1 27844 1726882777.80489: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.80503: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.80506: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.80509: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.81485: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b45 27844 1726882777.81488: WORKER PROCESS EXITING 27844 1726882777.82188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.88260: done with get_vars() 27844 1726882777.88288: done getting variables 27844 1726882777.88332: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882777.88419: variable 'profile' from source: include params 27844 1726882777.88422: variable 'item' from source: include params 27844 1726882777.88477: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:39:37 -0400 (0:00:00.101) 0:00:36.961 ****** 27844 1726882777.88502: entering _queue_task() for managed_node1/assert 27844 1726882777.88852: worker is 1 (out of 1 available) 27844 1726882777.88873: exiting _queue_task() for managed_node1/assert 27844 
1726882777.88888: done queuing things up, now waiting for results queue to drain 27844 1726882777.88891: waiting for pending results... 27844 1726882777.89202: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0' 27844 1726882777.89343: in run() - task 0e448fcc-3ce9-efa9-466a-000000000a6d 27844 1726882777.89369: variable 'ansible_search_path' from source: unknown 27844 1726882777.89379: variable 'ansible_search_path' from source: unknown 27844 1726882777.89426: calling self._execute() 27844 1726882777.89552: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.89569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.89585: variable 'omit' from source: magic vars 27844 1726882777.89973: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.89995: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.90006: variable 'omit' from source: magic vars 27844 1726882777.90056: variable 'omit' from source: magic vars 27844 1726882777.90162: variable 'profile' from source: include params 27844 1726882777.90177: variable 'item' from source: include params 27844 1726882777.90250: variable 'item' from source: include params 27844 1726882777.90280: variable 'omit' from source: magic vars 27844 1726882777.90334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882777.90379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882777.90405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882777.90432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882777.90450: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882777.90488: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882777.90497: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.90506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.90618: Set connection var ansible_shell_type to sh 27844 1726882777.90626: Set connection var ansible_connection to ssh 27844 1726882777.90640: Set connection var ansible_pipelining to False 27844 1726882777.90651: Set connection var ansible_timeout to 10 27844 1726882777.90661: Set connection var ansible_shell_executable to /bin/sh 27844 1726882777.90678: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882777.90712: variable 'ansible_shell_executable' from source: unknown 27844 1726882777.90720: variable 'ansible_connection' from source: unknown 27844 1726882777.90728: variable 'ansible_module_compression' from source: unknown 27844 1726882777.90734: variable 'ansible_shell_type' from source: unknown 27844 1726882777.90741: variable 'ansible_shell_executable' from source: unknown 27844 1726882777.90752: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.90759: variable 'ansible_pipelining' from source: unknown 27844 1726882777.90771: variable 'ansible_timeout' from source: unknown 27844 1726882777.90779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.90930: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882777.90948: variable 'omit' from source: magic vars 27844 1726882777.90958: starting 
attempt loop 27844 1726882777.90973: running the handler 27844 1726882777.91102: variable 'lsr_net_profile_exists' from source: set_fact 27844 1726882777.91112: Evaluated conditional (not lsr_net_profile_exists): True 27844 1726882777.91121: handler run complete 27844 1726882777.91140: attempt loop complete, returning result 27844 1726882777.91146: _execute() done 27844 1726882777.91151: dumping result to json 27844 1726882777.91158: done dumping result, returning 27844 1726882777.91173: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest0' [0e448fcc-3ce9-efa9-466a-000000000a6d] 27844 1726882777.91186: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a6d ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882777.91340: no more pending results, returning what we have 27844 1726882777.91343: results queue empty 27844 1726882777.91345: checking for any_errors_fatal 27844 1726882777.91355: done checking for any_errors_fatal 27844 1726882777.91355: checking for max_fail_percentage 27844 1726882777.91357: done checking for max_fail_percentage 27844 1726882777.91358: checking to see if all hosts have failed and the running result is not ok 27844 1726882777.91359: done checking to see if all hosts have failed 27844 1726882777.91360: getting the remaining hosts for this loop 27844 1726882777.91362: done getting the remaining hosts for this loop 27844 1726882777.91370: getting the next task for host managed_node1 27844 1726882777.91380: done getting next task for host managed_node1 27844 1726882777.91383: ^ task is: TASK: Include the task 'get_profile_stat.yml' 27844 1726882777.91387: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882777.91391: getting variables 27844 1726882777.91393: in VariableManager get_vars() 27844 1726882777.91437: Calling all_inventory to load vars for managed_node1 27844 1726882777.91440: Calling groups_inventory to load vars for managed_node1 27844 1726882777.91442: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.91455: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.91459: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.91461: Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.92484: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a6d 27844 1726882777.92487: WORKER PROCESS EXITING 27844 1726882777.93210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882777.94940: done with get_vars() 27844 1726882777.94965: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:39:37 -0400 (0:00:00.065) 0:00:37.027 ****** 27844 1726882777.95062: entering _queue_task() for managed_node1/include_tasks 27844 1726882777.95356: worker is 1 (out of 1 available) 
27844 1726882777.95373: exiting _queue_task() for managed_node1/include_tasks 27844 1726882777.95388: done queuing things up, now waiting for results queue to drain 27844 1726882777.95390: waiting for pending results... 27844 1726882777.95692: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 27844 1726882777.95842: in run() - task 0e448fcc-3ce9-efa9-466a-000000000a71 27844 1726882777.95863: variable 'ansible_search_path' from source: unknown 27844 1726882777.95878: variable 'ansible_search_path' from source: unknown 27844 1726882777.95922: calling self._execute() 27844 1726882777.96036: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882777.96052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882777.96071: variable 'omit' from source: magic vars 27844 1726882777.96447: variable 'ansible_distribution_major_version' from source: facts 27844 1726882777.96470: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882777.96486: _execute() done 27844 1726882777.96493: dumping result to json 27844 1726882777.96501: done dumping result, returning 27844 1726882777.96510: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-efa9-466a-000000000a71] 27844 1726882777.96520: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a71 27844 1726882777.96658: no more pending results, returning what we have 27844 1726882777.96665: in VariableManager get_vars() 27844 1726882777.96720: Calling all_inventory to load vars for managed_node1 27844 1726882777.96724: Calling groups_inventory to load vars for managed_node1 27844 1726882777.96727: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882777.96743: Calling all_plugins_play to load vars for managed_node1 27844 1726882777.96747: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882777.96750: 
Calling groups_plugins_play to load vars for managed_node1 27844 1726882777.97941: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a71 27844 1726882777.97944: WORKER PROCESS EXITING 27844 1726882777.98612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882778.00348: done with get_vars() 27844 1726882778.00369: variable 'ansible_search_path' from source: unknown 27844 1726882778.00371: variable 'ansible_search_path' from source: unknown 27844 1726882778.00408: we have included files to process 27844 1726882778.00409: generating all_blocks data 27844 1726882778.00411: done generating all_blocks data 27844 1726882778.00417: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27844 1726882778.00418: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27844 1726882778.00420: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27844 1726882778.01316: done processing included file 27844 1726882778.01318: iterating over new_blocks loaded from include file 27844 1726882778.01320: in VariableManager get_vars() 27844 1726882778.01339: done with get_vars() 27844 1726882778.01341: filtering new block on tags 27844 1726882778.01418: done filtering new block on tags 27844 1726882778.01421: in VariableManager get_vars() 27844 1726882778.01442: done with get_vars() 27844 1726882778.01444: filtering new block on tags 27844 1726882778.01502: done filtering new block on tags 27844 1726882778.01505: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 27844 1726882778.01511: extending task 
lists for all hosts with included blocks 27844 1726882778.01639: done extending task lists 27844 1726882778.01641: done processing included files 27844 1726882778.01642: results queue empty 27844 1726882778.01642: checking for any_errors_fatal 27844 1726882778.01646: done checking for any_errors_fatal 27844 1726882778.01647: checking for max_fail_percentage 27844 1726882778.01648: done checking for max_fail_percentage 27844 1726882778.01648: checking to see if all hosts have failed and the running result is not ok 27844 1726882778.01649: done checking to see if all hosts have failed 27844 1726882778.01650: getting the remaining hosts for this loop 27844 1726882778.01651: done getting the remaining hosts for this loop 27844 1726882778.01654: getting the next task for host managed_node1 27844 1726882778.01658: done getting next task for host managed_node1 27844 1726882778.01661: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 27844 1726882778.01668: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882778.01671: getting variables 27844 1726882778.01672: in VariableManager get_vars() 27844 1726882778.01685: Calling all_inventory to load vars for managed_node1 27844 1726882778.01687: Calling groups_inventory to load vars for managed_node1 27844 1726882778.01689: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882778.01694: Calling all_plugins_play to load vars for managed_node1 27844 1726882778.01697: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882778.01700: Calling groups_plugins_play to load vars for managed_node1 27844 1726882778.02969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882778.04647: done with get_vars() 27844 1726882778.04673: done getting variables 27844 1726882778.04714: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:39:38 -0400 (0:00:00.096) 0:00:37.124 ****** 27844 1726882778.04744: entering _queue_task() for managed_node1/set_fact 27844 1726882778.05060: worker is 1 (out of 1 available) 27844 1726882778.05077: exiting _queue_task() for managed_node1/set_fact 27844 1726882778.05091: done queuing things up, now waiting for results queue to drain 27844 1726882778.05092: waiting for pending results... 
27844 1726882778.05381: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 27844 1726882778.05504: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b79 27844 1726882778.05524: variable 'ansible_search_path' from source: unknown 27844 1726882778.05536: variable 'ansible_search_path' from source: unknown 27844 1726882778.05579: calling self._execute() 27844 1726882778.05687: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.05697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.05710: variable 'omit' from source: magic vars 27844 1726882778.06091: variable 'ansible_distribution_major_version' from source: facts 27844 1726882778.06107: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882778.06119: variable 'omit' from source: magic vars 27844 1726882778.06177: variable 'omit' from source: magic vars 27844 1726882778.06219: variable 'omit' from source: magic vars 27844 1726882778.06271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882778.06320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882778.06345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882778.06373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882778.06393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882778.06433: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882778.06442: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.06450: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 27844 1726882778.06558: Set connection var ansible_shell_type to sh 27844 1726882778.06572: Set connection var ansible_connection to ssh 27844 1726882778.06585: Set connection var ansible_pipelining to False 27844 1726882778.06593: Set connection var ansible_timeout to 10 27844 1726882778.06600: Set connection var ansible_shell_executable to /bin/sh 27844 1726882778.06607: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882778.06637: variable 'ansible_shell_executable' from source: unknown 27844 1726882778.06643: variable 'ansible_connection' from source: unknown 27844 1726882778.06649: variable 'ansible_module_compression' from source: unknown 27844 1726882778.06654: variable 'ansible_shell_type' from source: unknown 27844 1726882778.06659: variable 'ansible_shell_executable' from source: unknown 27844 1726882778.06669: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.06677: variable 'ansible_pipelining' from source: unknown 27844 1726882778.06682: variable 'ansible_timeout' from source: unknown 27844 1726882778.06688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.06833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882778.06857: variable 'omit' from source: magic vars 27844 1726882778.06872: starting attempt loop 27844 1726882778.06879: running the handler 27844 1726882778.06895: handler run complete 27844 1726882778.06910: attempt loop complete, returning result 27844 1726882778.06917: _execute() done 27844 1726882778.06923: dumping result to json 27844 1726882778.06929: done dumping result, returning 27844 1726882778.06939: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-efa9-466a-000000000b79] 27844 1726882778.06952: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b79 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 27844 1726882778.07117: no more pending results, returning what we have 27844 1726882778.07121: results queue empty 27844 1726882778.07122: checking for any_errors_fatal 27844 1726882778.07124: done checking for any_errors_fatal 27844 1726882778.07125: checking for max_fail_percentage 27844 1726882778.07127: done checking for max_fail_percentage 27844 1726882778.07128: checking to see if all hosts have failed and the running result is not ok 27844 1726882778.07129: done checking to see if all hosts have failed 27844 1726882778.07129: getting the remaining hosts for this loop 27844 1726882778.07131: done getting the remaining hosts for this loop 27844 1726882778.07134: getting the next task for host managed_node1 27844 1726882778.07142: done getting next task for host managed_node1 27844 1726882778.07144: ^ task is: TASK: Stat profile file 27844 1726882778.07152: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882778.07156: getting variables 27844 1726882778.07158: in VariableManager get_vars() 27844 1726882778.07206: Calling all_inventory to load vars for managed_node1 27844 1726882778.07209: Calling groups_inventory to load vars for managed_node1 27844 1726882778.07212: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882778.07226: Calling all_plugins_play to load vars for managed_node1 27844 1726882778.07229: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882778.07232: Calling groups_plugins_play to load vars for managed_node1 27844 1726882778.08491: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b79 27844 1726882778.08495: WORKER PROCESS EXITING 27844 1726882778.09097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882778.10352: done with get_vars() 27844 1726882778.10372: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:39:38 -0400 (0:00:00.056) 0:00:37.181 ****** 27844 1726882778.10440: entering _queue_task() for managed_node1/stat 27844 1726882778.10659: worker is 1 (out of 1 available) 27844 1726882778.10677: exiting _queue_task() for managed_node1/stat 27844 1726882778.10690: 
done queuing things up, now waiting for results queue to drain 27844 1726882778.10692: waiting for pending results... 27844 1726882778.10877: running TaskExecutor() for managed_node1/TASK: Stat profile file 27844 1726882778.10952: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b7a 27844 1726882778.10964: variable 'ansible_search_path' from source: unknown 27844 1726882778.10970: variable 'ansible_search_path' from source: unknown 27844 1726882778.10999: calling self._execute() 27844 1726882778.11084: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.11088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.11099: variable 'omit' from source: magic vars 27844 1726882778.11373: variable 'ansible_distribution_major_version' from source: facts 27844 1726882778.11384: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882778.11390: variable 'omit' from source: magic vars 27844 1726882778.11427: variable 'omit' from source: magic vars 27844 1726882778.11507: variable 'profile' from source: include params 27844 1726882778.11511: variable 'item' from source: include params 27844 1726882778.11811: variable 'item' from source: include params 27844 1726882778.11816: variable 'omit' from source: magic vars 27844 1726882778.11820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882778.11823: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882778.11826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882778.11829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882778.11832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 27844 1726882778.11835: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882778.11839: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.11842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.11878: Set connection var ansible_shell_type to sh 27844 1726882778.11887: Set connection var ansible_connection to ssh 27844 1726882778.11899: Set connection var ansible_pipelining to False 27844 1726882778.11908: Set connection var ansible_timeout to 10 27844 1726882778.11917: Set connection var ansible_shell_executable to /bin/sh 27844 1726882778.11927: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882778.11963: variable 'ansible_shell_executable' from source: unknown 27844 1726882778.11978: variable 'ansible_connection' from source: unknown 27844 1726882778.11990: variable 'ansible_module_compression' from source: unknown 27844 1726882778.11997: variable 'ansible_shell_type' from source: unknown 27844 1726882778.12003: variable 'ansible_shell_executable' from source: unknown 27844 1726882778.12010: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.12017: variable 'ansible_pipelining' from source: unknown 27844 1726882778.12028: variable 'ansible_timeout' from source: unknown 27844 1726882778.12034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.12237: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 27844 1726882778.12251: variable 'omit' from source: magic vars 27844 1726882778.12259: starting attempt loop 27844 1726882778.12270: running the handler 27844 1726882778.12292: _low_level_execute_command(): starting 27844 
1726882778.12305: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882778.12990: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882778.12994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.13023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.13026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.13029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.13081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.13094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.13203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.14878: stdout chunk (state=3): >>>/root <<< 27844 1726882778.15283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.15286: stdout chunk (state=3): >>><<< 27844 1726882778.15288: stderr chunk (state=3): >>><<< 27844 1726882778.15291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882778.15295: _low_level_execute_command(): starting 27844 1726882778.15297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127 `" && echo ansible-tmp-1726882778.1505682-29538-53550146274127="` echo /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127 `" ) && sleep 0' 27844 1726882778.15662: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882778.15673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.15687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.15702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 
1726882778.15737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.15744: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882778.15754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.15770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882778.15777: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882778.15784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882778.15797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.15807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.15817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.15824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.15832: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882778.15841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.15919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.15936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882778.15949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.16072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.17929: stdout chunk (state=3): >>>ansible-tmp-1726882778.1505682-29538-53550146274127=/root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127 <<< 27844 1726882778.18041: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 27844 1726882778.18093: stderr chunk (state=3): >>><<< 27844 1726882778.18098: stdout chunk (state=3): >>><<< 27844 1726882778.18113: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882778.1505682-29538-53550146274127=/root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882778.18147: variable 'ansible_module_compression' from source: unknown 27844 1726882778.18196: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27844 1726882778.18239: variable 'ansible_facts' from source: unknown 27844 1726882778.18303: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127/AnsiballZ_stat.py 27844 1726882778.18450: Sending initial data 27844 1726882778.18454: Sent 
initial data (152 bytes) 27844 1726882778.19432: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882778.19435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.19438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.19440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.19442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.19445: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882778.19447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.19449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882778.19451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882778.19453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882778.19455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.19457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.19462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.19466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.19482: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882778.19488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.19556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 
1726882778.19660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.21367: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882778.21450: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882778.21541: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp0gilg8ra /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127/AnsiballZ_stat.py <<< 27844 1726882778.21631: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882778.22624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.22746: stderr chunk (state=3): >>><<< 27844 1726882778.22749: stdout chunk (state=3): >>><<< 27844 1726882778.22774: done transferring module to remote 27844 1726882778.22784: _low_level_execute_command(): starting 27844 1726882778.22789: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127/ /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127/AnsiballZ_stat.py && sleep 0' 27844 1726882778.23445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 
1726882778.23450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.23500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.23503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.23505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.23566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.23571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.23669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.25385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.25425: stderr chunk (state=3): >>><<< 27844 1726882778.25428: stdout chunk (state=3): >>><<< 27844 1726882778.25439: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882778.25448: _low_level_execute_command(): starting 27844 1726882778.25451: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127/AnsiballZ_stat.py && sleep 0' 27844 1726882778.25837: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.25843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.25893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.25896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.25898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 
1726882778.25900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.25949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.25953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.26070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.39305: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27844 1726882778.40296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 27844 1726882778.40299: stdout chunk (state=3): >>><<< 27844 1726882778.40302: stderr chunk (state=3): >>><<< 27844 1726882778.40434: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
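The round-trip above shows the transferred `AnsiballZ_stat.py` payload checking `/etc/sysconfig/network-scripts/ifcfg-ethtest1` and printing `{"stat": {"exists": false}}` on stdout for the controller to parse. As a minimal sketch (not Ansible's actual module code), the check and the result shape seen in the log reduce to:

```python
import json
import os
import tempfile

def stat_result(path):
    # Same top-level result shape the stat module printed in the log above.
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

# A freshly created temp dir guarantees the child path does not exist,
# mirroring the absent ifcfg-ethtest1 profile on the managed node.
missing = os.path.join(tempfile.mkdtemp(), "ifcfg-ethtest1")
print(json.dumps(stat_result(missing)))
# → {"changed": false, "stat": {"exists": false}}
```

The real module also honors `get_attributes`, `get_checksum`, `get_mime`, `follow`, and `checksum_algorithm` as shown in the logged `module_args`; they are omitted here for brevity.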
27844 1726882778.40439: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882778.40443: _low_level_execute_command(): starting 27844 1726882778.40445: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882778.1505682-29538-53550146274127/ > /dev/null 2>&1 && sleep 0' 27844 1726882778.41117: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882778.41131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.41146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.41170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.41211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.41227: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882778.41239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.41254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 
1726882778.41269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882778.41280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882778.41291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.41303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.41316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.41326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.41336: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882778.41347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.41422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.41437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882778.41450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.42187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.44063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.44071: stdout chunk (state=3): >>><<< 27844 1726882778.44073: stderr chunk (state=3): >>><<< 27844 1726882778.44475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882778.44479: handler run complete 27844 1726882778.44481: attempt loop complete, returning result 27844 1726882778.44484: _execute() done 27844 1726882778.44486: dumping result to json 27844 1726882778.44488: done dumping result, returning 27844 1726882778.44490: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-efa9-466a-000000000b7a] 27844 1726882778.44492: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7a 27844 1726882778.44569: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7a ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 27844 1726882778.44624: no more pending results, returning what we have 27844 1726882778.44628: results queue empty 27844 1726882778.44629: checking for any_errors_fatal 27844 1726882778.44635: done checking for any_errors_fatal 27844 1726882778.44635: checking for max_fail_percentage 27844 1726882778.44637: done checking for max_fail_percentage 27844 1726882778.44638: checking to see if all hosts have failed and the running result is not ok 27844 1726882778.44639: done checking to see if all hosts have failed 27844 1726882778.44639: getting the remaining hosts for this loop 27844 1726882778.44641: done getting the remaining 
hosts for this loop 27844 1726882778.44644: getting the next task for host managed_node1 27844 1726882778.44650: done getting next task for host managed_node1 27844 1726882778.44653: ^ task is: TASK: Set NM profile exist flag based on the profile files 27844 1726882778.44661: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882778.44668: getting variables 27844 1726882778.44669: in VariableManager get_vars() 27844 1726882778.44712: Calling all_inventory to load vars for managed_node1 27844 1726882778.44715: Calling groups_inventory to load vars for managed_node1 27844 1726882778.44718: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882778.44729: Calling all_plugins_play to load vars for managed_node1 27844 1726882778.44732: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882778.44736: Calling groups_plugins_play to load vars for managed_node1 27844 1726882778.45254: WORKER PROCESS EXITING 27844 1726882778.47505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882778.50316: done with get_vars() 27844 1726882778.50339: done getting variables 27844 1726882778.51604: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:39:38 -0400 (0:00:00.411) 0:00:37.593 ****** 27844 1726882778.51641: entering _queue_task() for managed_node1/set_fact 27844 1726882778.51978: worker is 1 (out of 1 available) 27844 1726882778.51993: exiting _queue_task() for managed_node1/set_fact 27844 1726882778.52007: done queuing things up, now waiting for results queue to drain 27844 1726882778.52009: waiting for pending results... 
27844 1726882778.53045: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 27844 1726882778.53177: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b7b 27844 1726882778.53197: variable 'ansible_search_path' from source: unknown 27844 1726882778.53204: variable 'ansible_search_path' from source: unknown 27844 1726882778.53245: calling self._execute() 27844 1726882778.53975: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.53987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.54001: variable 'omit' from source: magic vars 27844 1726882778.54371: variable 'ansible_distribution_major_version' from source: facts 27844 1726882778.54389: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882778.54516: variable 'profile_stat' from source: set_fact 27844 1726882778.55183: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882778.55190: when evaluation is False, skipping this task 27844 1726882778.55196: _execute() done 27844 1726882778.55203: dumping result to json 27844 1726882778.55210: done dumping result, returning 27844 1726882778.55221: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-efa9-466a-000000000b7b] 27844 1726882778.55230: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7b skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882778.55380: no more pending results, returning what we have 27844 1726882778.55385: results queue empty 27844 1726882778.55386: checking for any_errors_fatal 27844 1726882778.55397: done checking for any_errors_fatal 27844 1726882778.55398: checking for max_fail_percentage 27844 1726882778.55399: done checking for max_fail_percentage 27844 1726882778.55400: checking to see if all 
hosts have failed and the running result is not ok 27844 1726882778.55401: done checking to see if all hosts have failed 27844 1726882778.55402: getting the remaining hosts for this loop 27844 1726882778.55403: done getting the remaining hosts for this loop 27844 1726882778.55407: getting the next task for host managed_node1 27844 1726882778.55413: done getting next task for host managed_node1 27844 1726882778.55415: ^ task is: TASK: Get NM profile info 27844 1726882778.55421: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882778.55425: getting variables 27844 1726882778.55427: in VariableManager get_vars() 27844 1726882778.55470: Calling all_inventory to load vars for managed_node1 27844 1726882778.55473: Calling groups_inventory to load vars for managed_node1 27844 1726882778.55476: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882778.55492: Calling all_plugins_play to load vars for managed_node1 27844 1726882778.55494: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882778.55498: Calling groups_plugins_play to load vars for managed_node1 27844 1726882778.57773: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7b 27844 1726882778.57778: WORKER PROCESS EXITING 27844 1726882778.58172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882778.61261: done with get_vars() 27844 1726882778.61295: done getting variables 27844 1726882778.61356: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:39:38 -0400 (0:00:00.097) 0:00:37.690 ****** 27844 1726882778.61392: entering _queue_task() for managed_node1/shell 27844 1726882778.61738: worker is 1 (out of 1 available) 27844 1726882778.61752: exiting _queue_task() for managed_node1/shell 27844 1726882778.62967: done queuing things up, now waiting for results queue to drain 27844 1726882778.62969: waiting for pending results... 
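The skipped task above illustrates how the executor evaluates a task's `when` clauses in order: `ansible_distribution_major_version != '6'` came back True, but `profile_stat.stat.exists` came back False, so the task was skipped with `"skip_reason": "Conditional result was False"`. A rough sketch of that all-must-be-true logic (using plain Python `eval` as a stand-in for Ansible's Jinja2 conditional evaluation, and an assumed distribution version, since the log only shows the comparison result):

```python
# Hedged sketch, not Ansible internals: a task runs only if every
# conditional in its `when` list evaluates truthy against the host's vars.
def should_run(conditionals, variables):
    return all(bool(eval(cond, {}, variables)) for cond in conditionals)

host_vars = {
    "ansible_distribution_major_version": "9",   # assumed; log only shows != '6' was True
    "profile_stat": {"stat": {"exists": False}}, # set_fact result from the stat task
}
conds = [
    "ansible_distribution_major_version != '6'",
    "profile_stat['stat']['exists']",            # dict access; Jinja2 allows dotted form
]
print("run" if should_run(conds, host_vars) else "skip")
# → skip
```

The first False short-circuits the loop, which matches the logged behavior: the executor reports `when evaluation is False, skipping this task` without running the module.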
27844 1726882778.62991: running TaskExecutor() for managed_node1/TASK: Get NM profile info 27844 1726882778.63088: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b7c 27844 1726882778.63109: variable 'ansible_search_path' from source: unknown 27844 1726882778.63115: variable 'ansible_search_path' from source: unknown 27844 1726882778.63157: calling self._execute() 27844 1726882778.63270: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.63979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.63994: variable 'omit' from source: magic vars 27844 1726882778.64372: variable 'ansible_distribution_major_version' from source: facts 27844 1726882778.64390: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882778.64401: variable 'omit' from source: magic vars 27844 1726882778.64457: variable 'omit' from source: magic vars 27844 1726882778.64561: variable 'profile' from source: include params 27844 1726882778.64577: variable 'item' from source: include params 27844 1726882778.64643: variable 'item' from source: include params 27844 1726882778.65290: variable 'omit' from source: magic vars 27844 1726882778.65337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882778.65381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882778.65406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882778.65427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882778.65443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882778.65482: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 
1726882778.65490: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.65498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.65603: Set connection var ansible_shell_type to sh 27844 1726882778.65610: Set connection var ansible_connection to ssh 27844 1726882778.65621: Set connection var ansible_pipelining to False 27844 1726882778.65631: Set connection var ansible_timeout to 10 27844 1726882778.65641: Set connection var ansible_shell_executable to /bin/sh 27844 1726882778.65651: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882778.65690: variable 'ansible_shell_executable' from source: unknown 27844 1726882778.65698: variable 'ansible_connection' from source: unknown 27844 1726882778.65705: variable 'ansible_module_compression' from source: unknown 27844 1726882778.65711: variable 'ansible_shell_type' from source: unknown 27844 1726882778.65717: variable 'ansible_shell_executable' from source: unknown 27844 1726882778.65724: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882778.65732: variable 'ansible_pipelining' from source: unknown 27844 1726882778.65738: variable 'ansible_timeout' from source: unknown 27844 1726882778.65746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882778.65885: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882778.65901: variable 'omit' from source: magic vars 27844 1726882778.65910: starting attempt loop 27844 1726882778.65916: running the handler 27844 1726882778.65929: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882778.65949: _low_level_execute_command(): starting 27844 1726882778.65960: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882778.67483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882778.68182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.68197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.68215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.68257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.68275: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882778.68290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.68307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882778.68317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882778.68328: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882778.68339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.68352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.68374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.68388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.68399: stderr chunk (state=3): >>>debug2: match found <<< 27844 
1726882778.68412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.68491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.68508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882778.68525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.68660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.70321: stdout chunk (state=3): >>>/root <<< 27844 1726882778.70505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.70508: stdout chunk (state=3): >>><<< 27844 1726882778.70511: stderr chunk (state=3): >>><<< 27844 1726882778.70616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 27844 1726882778.70628: _low_level_execute_command(): starting 27844 1726882778.70631: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960 `" && echo ansible-tmp-1726882778.7052972-29572-74283220445960="` echo /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960 `" ) && sleep 0' 27844 1726882778.72126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882778.72139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.72153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.72175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.72215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.72227: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882778.72239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.72254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882778.72268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882778.72281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882778.72294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.72309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.72324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.72337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.72348: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882778.72362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.72443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.72473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882778.72490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.72616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.74471: stdout chunk (state=3): >>>ansible-tmp-1726882778.7052972-29572-74283220445960=/root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960 <<< 27844 1726882778.74626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.74656: stderr chunk (state=3): >>><<< 27844 1726882778.74660: stdout chunk (state=3): >>><<< 27844 1726882778.74971: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882778.7052972-29572-74283220445960=/root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882778.74975: variable 'ansible_module_compression' from source: unknown 27844 1726882778.74977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882778.74979: variable 'ansible_facts' from source: unknown 27844 1726882778.74981: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960/AnsiballZ_command.py 27844 1726882778.75528: Sending initial data 27844 1726882778.75531: Sent initial data (155 bytes) 27844 1726882778.76543: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.76547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.76588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882778.76592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.76595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.76597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.76670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.76678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882778.76682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.76776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.78508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 27844 1726882778.78512: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882778.78594: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882778.78694: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmp00cqfj9y /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960/AnsiballZ_command.py <<< 27844 1726882778.78789: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882778.80270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.80471: stderr chunk (state=3): >>><<< 27844 1726882778.80475: stdout chunk (state=3): >>><<< 27844 
1726882778.80477: done transferring module to remote 27844 1726882778.80480: _low_level_execute_command(): starting 27844 1726882778.80486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960/ /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960/AnsiballZ_command.py && sleep 0' 27844 1726882778.81251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.81254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.81295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 27844 1726882778.81298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.81301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 27844 1726882778.81303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.81375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.81382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.81488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882778.83271: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882778.83291: stdout chunk (state=3): >>><<< 27844 1726882778.83294: stderr chunk (state=3): >>><<< 27844 1726882778.83381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882778.83385: _low_level_execute_command(): starting 27844 1726882778.83387: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960/AnsiballZ_command.py && sleep 0' 27844 1726882778.84709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882778.84884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.84898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.84915: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.84962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.84980: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882778.84993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.85009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882778.85019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882778.85028: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882778.85044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882778.85061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882778.85086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882778.85183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882778.85196: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882778.85209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882778.85290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882778.85482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882778.85499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882778.85629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882779.00381: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep 
ethtest1 | grep /etc", "start": "2024-09-20 21:39:38.984895", "end": "2024-09-20 21:39:39.002250", "delta": "0:00:00.017355", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882779.01609: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. <<< 27844 1726882779.01613: stdout chunk (state=3): >>><<< 27844 1726882779.01616: stderr chunk (state=3): >>><<< 27844 1726882779.01673: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-20 21:39:38.984895", "end": "2024-09-20 21:39:39.002250", "delta": "0:00:00.017355", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. 27844 1726882779.01777: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882779.01781: _low_level_execute_command(): starting 27844 1726882779.01784: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882778.7052972-29572-74283220445960/ > /dev/null 2>&1 && sleep 0' 27844 1726882779.02411: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882779.02426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882779.02447: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 27844 1726882779.02470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882779.02515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882779.02528: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882779.02542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882779.02572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882779.02585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882779.02597: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882779.02609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882779.02623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882779.02640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882779.02652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882779.02674: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882779.02690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882779.02770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882779.02798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882779.02815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882779.02940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882779.04753: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 27844 1726882779.04849: stderr chunk (state=3): >>><<< 27844 1726882779.04860: stdout chunk (state=3): >>><<< 27844 1726882779.05178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882779.05181: handler run complete 27844 1726882779.05184: Evaluated conditional (False): False 27844 1726882779.05186: attempt loop complete, returning result 27844 1726882779.05187: _execute() done 27844 1726882779.05189: dumping result to json 27844 1726882779.05191: done dumping result, returning 27844 1726882779.05193: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-efa9-466a-000000000b7c] 27844 1726882779.05195: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7c 27844 1726882779.05271: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7c 27844 1726882779.05275: 
WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "delta": "0:00:00.017355", "end": "2024-09-20 21:39:39.002250", "rc": 1, "start": "2024-09-20 21:39:38.984895" } MSG: non-zero return code ...ignoring 27844 1726882779.05356: no more pending results, returning what we have 27844 1726882779.05360: results queue empty 27844 1726882779.05361: checking for any_errors_fatal 27844 1726882779.05372: done checking for any_errors_fatal 27844 1726882779.05373: checking for max_fail_percentage 27844 1726882779.05374: done checking for max_fail_percentage 27844 1726882779.05375: checking to see if all hosts have failed and the running result is not ok 27844 1726882779.05376: done checking to see if all hosts have failed 27844 1726882779.05377: getting the remaining hosts for this loop 27844 1726882779.05379: done getting the remaining hosts for this loop 27844 1726882779.05382: getting the next task for host managed_node1 27844 1726882779.05389: done getting next task for host managed_node1 27844 1726882779.05392: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27844 1726882779.05397: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882779.05400: getting variables 27844 1726882779.05402: in VariableManager get_vars() 27844 1726882779.05448: Calling all_inventory to load vars for managed_node1 27844 1726882779.05451: Calling groups_inventory to load vars for managed_node1 27844 1726882779.05454: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.05475: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.05479: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.05482: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.07279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.09070: done with get_vars() 27844 1726882779.09093: done getting variables 27844 1726882779.09155: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:39:39 -0400 (0:00:00.477) 0:00:38.168 ****** 27844 
1726882779.09195: entering _queue_task() for managed_node1/set_fact 27844 1726882779.09533: worker is 1 (out of 1 available) 27844 1726882779.09546: exiting _queue_task() for managed_node1/set_fact 27844 1726882779.09558: done queuing things up, now waiting for results queue to drain 27844 1726882779.09560: waiting for pending results... 27844 1726882779.09862: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27844 1726882779.10001: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b7d 27844 1726882779.10029: variable 'ansible_search_path' from source: unknown 27844 1726882779.10038: variable 'ansible_search_path' from source: unknown 27844 1726882779.10085: calling self._execute() 27844 1726882779.10200: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.10211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.10229: variable 'omit' from source: magic vars 27844 1726882779.10627: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.10645: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.10801: variable 'nm_profile_exists' from source: set_fact 27844 1726882779.10819: Evaluated conditional (nm_profile_exists.rc == 0): False 27844 1726882779.10826: when evaluation is False, skipping this task 27844 1726882779.10832: _execute() done 27844 1726882779.10839: dumping result to json 27844 1726882779.10846: done dumping result, returning 27844 1726882779.10857: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-efa9-466a-000000000b7d] 27844 1726882779.10871: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7d skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result 
was False" } 27844 1726882779.11033: no more pending results, returning what we have 27844 1726882779.11037: results queue empty 27844 1726882779.11038: checking for any_errors_fatal 27844 1726882779.11048: done checking for any_errors_fatal 27844 1726882779.11049: checking for max_fail_percentage 27844 1726882779.11051: done checking for max_fail_percentage 27844 1726882779.11052: checking to see if all hosts have failed and the running result is not ok 27844 1726882779.11053: done checking to see if all hosts have failed 27844 1726882779.11053: getting the remaining hosts for this loop 27844 1726882779.11055: done getting the remaining hosts for this loop 27844 1726882779.11059: getting the next task for host managed_node1 27844 1726882779.11072: done getting next task for host managed_node1 27844 1726882779.11076: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 27844 1726882779.11082: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 27844 1726882779.11088: getting variables 27844 1726882779.11090: in VariableManager get_vars() 27844 1726882779.11133: Calling all_inventory to load vars for managed_node1 27844 1726882779.11136: Calling groups_inventory to load vars for managed_node1 27844 1726882779.11138: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.11152: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.11156: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.11159: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.12232: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7d 27844 1726882779.12235: WORKER PROCESS EXITING 27844 1726882779.12951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.14834: done with get_vars() 27844 1726882779.14856: done getting variables 27844 1726882779.14925: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882779.15048: variable 'profile' from source: include params 27844 1726882779.15052: variable 'item' from source: include params 27844 1726882779.15124: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest1] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:39:39 -0400 (0:00:00.059) 0:00:38.228 ****** 27844 1726882779.15158: entering _queue_task() for managed_node1/command 27844 1726882779.15487: worker is 1 (out of 1 available) 27844 
1726882779.15502: exiting _queue_task() for managed_node1/command 27844 1726882779.15515: done queuing things up, now waiting for results queue to drain 27844 1726882779.15516: waiting for pending results... 27844 1726882779.15827: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest1 27844 1726882779.15956: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b7f 27844 1726882779.15986: variable 'ansible_search_path' from source: unknown 27844 1726882779.15994: variable 'ansible_search_path' from source: unknown 27844 1726882779.16035: calling self._execute() 27844 1726882779.16146: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.16159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.16185: variable 'omit' from source: magic vars 27844 1726882779.16560: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.16576: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.16685: variable 'profile_stat' from source: set_fact 27844 1726882779.16693: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882779.16696: when evaluation is False, skipping this task 27844 1726882779.16699: _execute() done 27844 1726882779.16701: dumping result to json 27844 1726882779.16703: done dumping result, returning 27844 1726882779.16710: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-ethtest1 [0e448fcc-3ce9-efa9-466a-000000000b7f] 27844 1726882779.16715: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7f 27844 1726882779.16802: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b7f 27844 1726882779.16805: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882779.16887: no more pending results, 
returning what we have 27844 1726882779.16890: results queue empty 27844 1726882779.16891: checking for any_errors_fatal 27844 1726882779.16896: done checking for any_errors_fatal 27844 1726882779.16897: checking for max_fail_percentage 27844 1726882779.16898: done checking for max_fail_percentage 27844 1726882779.16899: checking to see if all hosts have failed and the running result is not ok 27844 1726882779.16900: done checking to see if all hosts have failed 27844 1726882779.16900: getting the remaining hosts for this loop 27844 1726882779.16902: done getting the remaining hosts for this loop 27844 1726882779.16905: getting the next task for host managed_node1 27844 1726882779.16910: done getting next task for host managed_node1 27844 1726882779.16913: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 27844 1726882779.16919: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 27844 1726882779.16922: getting variables 27844 1726882779.16923: in VariableManager get_vars() 27844 1726882779.16956: Calling all_inventory to load vars for managed_node1 27844 1726882779.16959: Calling groups_inventory to load vars for managed_node1 27844 1726882779.16961: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.16978: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.16980: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.16983: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.17765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.19006: done with get_vars() 27844 1726882779.19028: done getting variables 27844 1726882779.19089: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882779.19194: variable 'profile' from source: include params 27844 1726882779.19198: variable 'item' from source: include params 27844 1726882779.19276: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest1] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:39:39 -0400 (0:00:00.041) 0:00:38.269 ****** 27844 1726882779.19309: entering _queue_task() for managed_node1/set_fact 27844 1726882779.19609: worker is 1 (out of 1 available) 27844 1726882779.19623: exiting _queue_task() for managed_node1/set_fact 27844 1726882779.19637: done queuing things up, now waiting for results queue to drain 27844 1726882779.19639: waiting for 
pending results... 27844 1726882779.19919: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 27844 1726882779.20007: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b80 27844 1726882779.20018: variable 'ansible_search_path' from source: unknown 27844 1726882779.20021: variable 'ansible_search_path' from source: unknown 27844 1726882779.20070: calling self._execute() 27844 1726882779.20152: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.20156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.20166: variable 'omit' from source: magic vars 27844 1726882779.20621: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.20634: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.20804: variable 'profile_stat' from source: set_fact 27844 1726882779.20816: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882779.20819: when evaluation is False, skipping this task 27844 1726882779.20821: _execute() done 27844 1726882779.20824: dumping result to json 27844 1726882779.20826: done dumping result, returning 27844 1726882779.20836: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 [0e448fcc-3ce9-efa9-466a-000000000b80] 27844 1726882779.20852: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b80 27844 1726882779.20968: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b80 27844 1726882779.20972: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882779.21025: no more pending results, returning what we have 27844 1726882779.21029: results queue empty 27844 1726882779.21033: checking for any_errors_fatal 27844 1726882779.21044: done checking for any_errors_fatal 
27844 1726882779.21047: checking for max_fail_percentage 27844 1726882779.21049: done checking for max_fail_percentage 27844 1726882779.21050: checking to see if all hosts have failed and the running result is not ok 27844 1726882779.21051: done checking to see if all hosts have failed 27844 1726882779.21051: getting the remaining hosts for this loop 27844 1726882779.21053: done getting the remaining hosts for this loop 27844 1726882779.21056: getting the next task for host managed_node1 27844 1726882779.21065: done getting next task for host managed_node1 27844 1726882779.21070: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 27844 1726882779.21075: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882779.21080: getting variables 27844 1726882779.21082: in VariableManager get_vars() 27844 1726882779.21118: Calling all_inventory to load vars for managed_node1 27844 1726882779.21120: Calling groups_inventory to load vars for managed_node1 27844 1726882779.21122: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.21132: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.21134: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.21136: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.25462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.29681: done with get_vars() 27844 1726882779.29844: done getting variables 27844 1726882779.29913: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882779.30295: variable 'profile' from source: include params 27844 1726882779.30300: variable 'item' from source: include params 27844 1726882779.30473: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest1] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:39:39 -0400 (0:00:00.112) 0:00:38.381 ****** 27844 1726882779.30518: entering _queue_task() for managed_node1/command 27844 1726882779.31421: worker is 1 (out of 1 available) 27844 1726882779.31445: exiting _queue_task() for managed_node1/command 27844 1726882779.31581: done queuing things up, now waiting for results queue to drain 27844 1726882779.31584: waiting for pending results... 
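The task announced above (get_profile_stat.yml:62) loads the `command` action plugin under the same pair of conditionals. A plausible sketch of such a fingerprint-lookup task; the grep pattern, file path, and register name are assumptions, since the log does not echo the task body:

```yaml
# Hypothetical reconstruction of the fingerprint-lookup task.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.command:
    cmd: grep '^# System Role:' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # assumed command
  register: fingerprint_comment   # assumed register name
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists
```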
27844 1726882779.32131: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest1 27844 1726882779.32249: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b81 27844 1726882779.32262: variable 'ansible_search_path' from source: unknown 27844 1726882779.32267: variable 'ansible_search_path' from source: unknown 27844 1726882779.32307: calling self._execute() 27844 1726882779.32410: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.32414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.32424: variable 'omit' from source: magic vars 27844 1726882779.33250: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.33332: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.33760: variable 'profile_stat' from source: set_fact 27844 1726882779.33775: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882779.33780: when evaluation is False, skipping this task 27844 1726882779.33787: _execute() done 27844 1726882779.33791: dumping result to json 27844 1726882779.33795: done dumping result, returning 27844 1726882779.33805: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-ethtest1 [0e448fcc-3ce9-efa9-466a-000000000b81] 27844 1726882779.33811: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b81 27844 1726882779.33941: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b81 27844 1726882779.33944: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882779.34004: no more pending results, returning what we have 27844 1726882779.34008: results queue empty 27844 1726882779.34009: checking for any_errors_fatal 27844 1726882779.34019: done checking for any_errors_fatal 27844 1726882779.34020: checking for 
max_fail_percentage 27844 1726882779.34021: done checking for max_fail_percentage 27844 1726882779.34022: checking to see if all hosts have failed and the running result is not ok 27844 1726882779.34023: done checking to see if all hosts have failed 27844 1726882779.34024: getting the remaining hosts for this loop 27844 1726882779.34025: done getting the remaining hosts for this loop 27844 1726882779.34029: getting the next task for host managed_node1 27844 1726882779.34035: done getting next task for host managed_node1 27844 1726882779.34038: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 27844 1726882779.34043: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882779.34047: getting variables 27844 1726882779.34049: in VariableManager get_vars() 27844 1726882779.34095: Calling all_inventory to load vars for managed_node1 27844 1726882779.34098: Calling groups_inventory to load vars for managed_node1 27844 1726882779.34100: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.34112: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.34115: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.34118: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.35642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.37917: done with get_vars() 27844 1726882779.37938: done getting variables 27844 1726882779.38006: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882779.38119: variable 'profile' from source: include params 27844 1726882779.38123: variable 'item' from source: include params 27844 1726882779.38182: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest1] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:39:39 -0400 (0:00:00.077) 0:00:38.458 ****** 27844 1726882779.38221: entering _queue_task() for managed_node1/set_fact 27844 1726882779.38539: worker is 1 (out of 1 available) 27844 1726882779.38551: exiting _queue_task() for managed_node1/set_fact 27844 1726882779.38565: done queuing things up, now waiting for results queue to drain 27844 1726882779.38567: waiting for pending results... 
27844 1726882779.39105: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest1 27844 1726882779.39255: in run() - task 0e448fcc-3ce9-efa9-466a-000000000b82 27844 1726882779.39278: variable 'ansible_search_path' from source: unknown 27844 1726882779.39294: variable 'ansible_search_path' from source: unknown 27844 1726882779.39339: calling self._execute() 27844 1726882779.39456: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.39469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.39486: variable 'omit' from source: magic vars 27844 1726882779.39886: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.39903: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.40093: variable 'profile_stat' from source: set_fact 27844 1726882779.40111: Evaluated conditional (profile_stat.stat.exists): False 27844 1726882779.40170: when evaluation is False, skipping this task 27844 1726882779.40179: _execute() done 27844 1726882779.40186: dumping result to json 27844 1726882779.40193: done dumping result, returning 27844 1726882779.40204: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-ethtest1 [0e448fcc-3ce9-efa9-466a-000000000b82] 27844 1726882779.40215: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b82 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27844 1726882779.40359: no more pending results, returning what we have 27844 1726882779.40366: results queue empty 27844 1726882779.40368: checking for any_errors_fatal 27844 1726882779.40375: done checking for any_errors_fatal 27844 1726882779.40376: checking for max_fail_percentage 27844 1726882779.40378: done checking for max_fail_percentage 27844 1726882779.40379: checking to see if all hosts 
have failed and the running result is not ok 27844 1726882779.40380: done checking to see if all hosts have failed 27844 1726882779.40381: getting the remaining hosts for this loop 27844 1726882779.40383: done getting the remaining hosts for this loop 27844 1726882779.40386: getting the next task for host managed_node1 27844 1726882779.40394: done getting next task for host managed_node1 27844 1726882779.40397: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 27844 1726882779.40402: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882779.40407: getting variables 27844 1726882779.40409: in VariableManager get_vars() 27844 1726882779.40454: Calling all_inventory to load vars for managed_node1 27844 1726882779.40457: Calling groups_inventory to load vars for managed_node1 27844 1726882779.40460: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.40476: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.40479: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.40483: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.41595: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000b82 27844 1726882779.41599: WORKER PROCESS EXITING 27844 1726882779.42427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.46180: done with get_vars() 27844 1726882779.46205: done getting variables 27844 1726882779.46266: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 27844 1726882779.46762: variable 'profile' from source: include params 27844 1726882779.46768: variable 'item' from source: include params 27844 1726882779.46827: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:39:39 -0400 (0:00:00.087) 0:00:38.546 ****** 27844 1726882779.46980: entering _queue_task() for managed_node1/assert 27844 1726882779.47654: worker is 1 (out of 1 available) 27844 1726882779.47669: exiting _queue_task() for managed_node1/assert 27844 
1726882779.47682: done queuing things up, now waiting for results queue to drain 27844 1726882779.47684: waiting for pending results... 27844 1726882779.48689: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest1' 27844 1726882779.48694: in run() - task 0e448fcc-3ce9-efa9-466a-000000000a72 27844 1726882779.48699: variable 'ansible_search_path' from source: unknown 27844 1726882779.48701: variable 'ansible_search_path' from source: unknown 27844 1726882779.48704: calling self._execute() 27844 1726882779.49187: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.49192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.49203: variable 'omit' from source: magic vars 27844 1726882779.50065: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.50079: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.50089: variable 'omit' from source: magic vars 27844 1726882779.50134: variable 'omit' from source: magic vars 27844 1726882779.50235: variable 'profile' from source: include params 27844 1726882779.50240: variable 'item' from source: include params 27844 1726882779.50306: variable 'item' from source: include params 27844 1726882779.50323: variable 'omit' from source: magic vars 27844 1726882779.50367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882779.50711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882779.50731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882779.50749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882779.50761: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882779.50795: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882779.50799: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.50801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.50902: Set connection var ansible_shell_type to sh 27844 1726882779.50905: Set connection var ansible_connection to ssh 27844 1726882779.50911: Set connection var ansible_pipelining to False 27844 1726882779.50916: Set connection var ansible_timeout to 10 27844 1726882779.50922: Set connection var ansible_shell_executable to /bin/sh 27844 1726882779.50927: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882779.50952: variable 'ansible_shell_executable' from source: unknown 27844 1726882779.50955: variable 'ansible_connection' from source: unknown 27844 1726882779.50957: variable 'ansible_module_compression' from source: unknown 27844 1726882779.50960: variable 'ansible_shell_type' from source: unknown 27844 1726882779.50962: variable 'ansible_shell_executable' from source: unknown 27844 1726882779.50966: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.51377: variable 'ansible_pipelining' from source: unknown 27844 1726882779.51381: variable 'ansible_timeout' from source: unknown 27844 1726882779.51384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.51517: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882779.51531: variable 'omit' from source: magic vars 27844 1726882779.51536: starting 
attempt loop 27844 1726882779.51539: running the handler 27844 1726882779.51660: variable 'lsr_net_profile_exists' from source: set_fact 27844 1726882779.51666: Evaluated conditional (not lsr_net_profile_exists): True 27844 1726882779.51676: handler run complete 27844 1726882779.51691: attempt loop complete, returning result 27844 1726882779.51694: _execute() done 27844 1726882779.51697: dumping result to json 27844 1726882779.51700: done dumping result, returning 27844 1726882779.51706: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'ethtest1' [0e448fcc-3ce9-efa9-466a-000000000a72] 27844 1726882779.51711: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a72 27844 1726882779.51806: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000a72 27844 1726882779.51810: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 27844 1726882779.51867: no more pending results, returning what we have 27844 1726882779.51872: results queue empty 27844 1726882779.51873: checking for any_errors_fatal 27844 1726882779.51884: done checking for any_errors_fatal 27844 1726882779.51885: checking for max_fail_percentage 27844 1726882779.51887: done checking for max_fail_percentage 27844 1726882779.51888: checking to see if all hosts have failed and the running result is not ok 27844 1726882779.51889: done checking to see if all hosts have failed 27844 1726882779.51890: getting the remaining hosts for this loop 27844 1726882779.51892: done getting the remaining hosts for this loop 27844 1726882779.51895: getting the next task for host managed_node1 27844 1726882779.51905: done getting next task for host managed_node1 27844 1726882779.51908: ^ task is: TASK: Verify network state restored to default 27844 1726882779.51912: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882779.51919: getting variables 27844 1726882779.51921: in VariableManager get_vars() 27844 1726882779.51967: Calling all_inventory to load vars for managed_node1 27844 1726882779.51971: Calling groups_inventory to load vars for managed_node1 27844 1726882779.51974: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.51986: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.51989: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.51992: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.56192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.62930: done with get_vars() 27844 1726882779.63071: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:169 Friday 20 September 2024 21:39:39 -0400 (0:00:00.161) 0:00:38.708 ****** 27844 1726882779.63292: entering _queue_task() for managed_node1/include_tasks 27844 1726882779.63952: worker is 1 (out of 1 available) 27844 1726882779.63965: exiting _queue_task() for managed_node1/include_tasks 27844 1726882779.63976: done queuing things up, now waiting for results queue to drain 27844 1726882779.63977: waiting for pending results... 
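The `assert` task that just passed (assert_profile_absent.yml:5) is directly grounded in the trace: the handler evaluated `not lsr_net_profile_exists` as True and returned "All assertions passed". A minimal sketch of a task consistent with that trace:

```yaml
# Sketch consistent with the trace above; exact file contents not shown in this log.
- name: Assert that the profile is absent - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists   # evaluated True in the run above
```

Unlike the preceding skipped tasks, this one reaches "running the handler" because its only `when:` guard, `ansible_distribution_major_version != '6'`, evaluated True.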
27844 1726882779.64876: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 27844 1726882779.65100: in run() - task 0e448fcc-3ce9-efa9-466a-0000000000bb 27844 1726882779.65120: variable 'ansible_search_path' from source: unknown 27844 1726882779.65178: calling self._execute() 27844 1726882779.65340: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.65471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.65488: variable 'omit' from source: magic vars 27844 1726882779.66191: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.66342: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.66353: _execute() done 27844 1726882779.66361: dumping result to json 27844 1726882779.66371: done dumping result, returning 27844 1726882779.66382: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-efa9-466a-0000000000bb] 27844 1726882779.66392: sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000bb 27844 1726882779.66523: no more pending results, returning what we have 27844 1726882779.66529: in VariableManager get_vars() 27844 1726882779.66582: Calling all_inventory to load vars for managed_node1 27844 1726882779.66585: Calling groups_inventory to load vars for managed_node1 27844 1726882779.66588: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.66605: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.66608: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.66611: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.67960: done sending task result for task 0e448fcc-3ce9-efa9-466a-0000000000bb 27844 1726882779.67966: WORKER PROCESS EXITING 27844 1726882779.69338: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.72973: done with get_vars() 27844 1726882779.72994: variable 'ansible_search_path' from source: unknown 27844 1726882779.73008: we have included files to process 27844 1726882779.73009: generating all_blocks data 27844 1726882779.73011: done generating all_blocks data 27844 1726882779.73017: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27844 1726882779.73019: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27844 1726882779.73021: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27844 1726882779.73855: done processing included file 27844 1726882779.73857: iterating over new_blocks loaded from include file 27844 1726882779.73859: in VariableManager get_vars() 27844 1726882779.73939: done with get_vars() 27844 1726882779.73941: filtering new block on tags 27844 1726882779.73977: done filtering new block on tags 27844 1726882779.73979: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 27844 1726882779.73984: extending task lists for all hosts with included blocks 27844 1726882779.78408: done extending task lists 27844 1726882779.78410: done processing included files 27844 1726882779.78411: results queue empty 27844 1726882779.78411: checking for any_errors_fatal 27844 1726882779.78415: done checking for any_errors_fatal 27844 1726882779.78416: checking for max_fail_percentage 27844 1726882779.78417: done checking for max_fail_percentage 27844 1726882779.78418: checking to see if all hosts have failed and the running 
result is not ok 27844 1726882779.78419: done checking to see if all hosts have failed 27844 1726882779.78420: getting the remaining hosts for this loop 27844 1726882779.78421: done getting the remaining hosts for this loop 27844 1726882779.78424: getting the next task for host managed_node1 27844 1726882779.78428: done getting next task for host managed_node1 27844 1726882779.78430: ^ task is: TASK: Check routes and DNS 27844 1726882779.78433: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27844 1726882779.78436: getting variables 27844 1726882779.78437: in VariableManager get_vars() 27844 1726882779.78452: Calling all_inventory to load vars for managed_node1 27844 1726882779.78454: Calling groups_inventory to load vars for managed_node1 27844 1726882779.78457: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882779.78462: Calling all_plugins_play to load vars for managed_node1 27844 1726882779.78467: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882779.78470: Calling groups_plugins_play to load vars for managed_node1 27844 1726882779.81053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882779.84381: done with get_vars() 27844 1726882779.84409: done getting variables 27844 1726882779.84569: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:39:39 -0400 (0:00:00.214) 0:00:38.922 ****** 27844 1726882779.84603: entering _queue_task() for managed_node1/shell 27844 1726882779.85317: worker is 1 (out of 1 available) 27844 1726882779.85418: exiting _queue_task() for managed_node1/shell 27844 1726882779.85431: done queuing things up, now waiting for results queue to drain 27844 1726882779.85433: waiting for pending results... 
27844 1726882779.85740: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 27844 1726882779.85879: in run() - task 0e448fcc-3ce9-efa9-466a-000000000bb6 27844 1726882779.85909: variable 'ansible_search_path' from source: unknown 27844 1726882779.85919: variable 'ansible_search_path' from source: unknown 27844 1726882779.85965: calling self._execute() 27844 1726882779.86068: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.86081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.86095: variable 'omit' from source: magic vars 27844 1726882779.86479: variable 'ansible_distribution_major_version' from source: facts 27844 1726882779.86498: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882779.86510: variable 'omit' from source: magic vars 27844 1726882779.86572: variable 'omit' from source: magic vars 27844 1726882779.86611: variable 'omit' from source: magic vars 27844 1726882779.86666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882779.86709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 1726882779.86884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882779.86908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882779.86927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882779.86958: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882779.86976: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.86988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.87215: 
Set connection var ansible_shell_type to sh 27844 1726882779.87224: Set connection var ansible_connection to ssh 27844 1726882779.87312: Set connection var ansible_pipelining to False 27844 1726882779.87324: Set connection var ansible_timeout to 10 27844 1726882779.87334: Set connection var ansible_shell_executable to /bin/sh 27844 1726882779.87344: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882779.87381: variable 'ansible_shell_executable' from source: unknown 27844 1726882779.87391: variable 'ansible_connection' from source: unknown 27844 1726882779.87415: variable 'ansible_module_compression' from source: unknown 27844 1726882779.87523: variable 'ansible_shell_type' from source: unknown 27844 1726882779.87535: variable 'ansible_shell_executable' from source: unknown 27844 1726882779.87542: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882779.87549: variable 'ansible_pipelining' from source: unknown 27844 1726882779.87556: variable 'ansible_timeout' from source: unknown 27844 1726882779.87562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882779.87812: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882779.87829: variable 'omit' from source: magic vars 27844 1726882779.87839: starting attempt loop 27844 1726882779.87859: running the handler 27844 1726882779.87877: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882779.87999: 
_low_level_execute_command(): starting 27844 1726882779.88012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882779.89872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882779.89888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882779.89898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882779.89919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882779.89956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882779.90032: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882779.90043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882779.90058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882779.90065: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882779.90075: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882779.90084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882779.90095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882779.90107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882779.90115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882779.90123: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882779.90138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882779.90209: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 27844 1726882779.90345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882779.90362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882779.90489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882779.92162: stdout chunk (state=3): >>>/root <<< 27844 1726882779.92332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882779.92335: stdout chunk (state=3): >>><<< 27844 1726882779.92346: stderr chunk (state=3): >>><<< 27844 1726882779.92371: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882779.92386: _low_level_execute_command(): starting 27844 1726882779.92396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611 `" && echo ansible-tmp-1726882779.923704-29613-224461680793611="` echo /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611 `" ) && sleep 0' 27844 1726882779.93594: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882779.93986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882779.93995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882779.94024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882779.94059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882779.94070: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882779.94077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882779.94093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882779.94117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882779.94124: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882779.94144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882779.94147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882779.94318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882779.94327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882779.94334: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882779.94343: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882779.94418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882779.94435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882779.94446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882779.94565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882779.96441: stdout chunk (state=3): >>>ansible-tmp-1726882779.923704-29613-224461680793611=/root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611 <<< 27844 1726882779.96609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882779.96612: stdout chunk (state=3): >>><<< 27844 1726882779.96620: stderr chunk (state=3): >>><<< 27844 1726882779.96637: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882779.923704-29613-224461680793611=/root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882779.96673: variable 'ansible_module_compression' from source: unknown 27844 1726882779.96726: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882779.96758: variable 'ansible_facts' from source: unknown 27844 1726882779.96831: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611/AnsiballZ_command.py 27844 1726882779.97923: Sending initial data 27844 1726882779.97926: Sent initial data (155 bytes) 27844 1726882779.99901: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882779.99910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882779.99920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882779.99935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882779.99974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882779.99982: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882779.99993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.00005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.00013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.00020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882780.00027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.00036: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.00048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.00055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.00061: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.00075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.00146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.00169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.00180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.00300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.02029: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882780.02119: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882780.02214: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmppxifzuxf /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611/AnsiballZ_command.py <<< 27844 1726882780.02305: stderr chunk 
(state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882780.03872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882780.03952: stderr chunk (state=3): >>><<< 27844 1726882780.03955: stdout chunk (state=3): >>><<< 27844 1726882780.03980: done transferring module to remote 27844 1726882780.03992: _low_level_execute_command(): starting 27844 1726882780.03997: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611/ /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611/AnsiballZ_command.py && sleep 0' 27844 1726882780.05796: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882780.05823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.05839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.05856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.05923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.06050: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882780.06068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.06086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.06097: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.06113: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882780.06131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.06149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 
1726882780.06173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.06185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.06195: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.06213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.06315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.06385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.06401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.06593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.08416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882780.08419: stdout chunk (state=3): >>><<< 27844 1726882780.08421: stderr chunk (state=3): >>><<< 27844 1726882780.08513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882780.08517: _low_level_execute_command(): starting 27844 1726882780.08519: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611/AnsiballZ_command.py && sleep 0' 27844 1726882780.09303: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.09307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.09339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.09342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.09344: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.09347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.09412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.09416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 27844 1726882780.09528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.23362: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2691sec preferred_lft 2691sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:39:40.223699", "end": "2024-09-20 21:39:40.231861", "delta": "0:00:00.008162", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882780.24570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882780.24610: stderr chunk (state=3): >>><<< 27844 1726882780.24614: stdout chunk (state=3): >>><<< 27844 1726882780.24635: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2691sec preferred_lft 2691sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:39:40.223699", "end": "2024-09-20 21:39:40.231861", "delta": "0:00:00.008162", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip 
a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
27844 1726882780.24689: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27844 1726882780.24697: _low_level_execute_command(): starting 27844 1726882780.24702: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882779.923704-29613-224461680793611/ > /dev/null 2>&1 && sleep 0' 27844 1726882780.28014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882780.28039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.28052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.28082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.28119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.28126: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882780.28140: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.28206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.28294: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.28361: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882780.28454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.28493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.28504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.28512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.28534: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.28746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.29057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.29115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.29182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.29381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.31196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882780.31296: stderr chunk (state=3): >>><<< 27844 1726882780.31299: stdout chunk (state=3): >>><<< 27844 1726882780.31324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882780.31343: handler run complete 27844 1726882780.31372: Evaluated conditional (False): False 27844 1726882780.31382: attempt loop complete, returning result 27844 1726882780.31388: _execute() done 27844 1726882780.31391: dumping result to json 27844 1726882780.31437: done dumping result, returning 27844 1726882780.31446: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0e448fcc-3ce9-efa9-466a-000000000bb6] 27844 1726882780.31453: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000bb6 27844 1726882780.31648: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000bb6 27844 1726882780.31650: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008162", "end": "2024-09-20 21:39:40.231861", "rc": 0, "start": "2024-09-20 21:39:40.223699" } STDOUT: IP 1: lo: mtu 65536 
qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2691sec preferred_lft 2691sec inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 27844 1726882780.31744: no more pending results, returning what we have 27844 1726882780.31749: results queue empty 27844 1726882780.31750: checking for any_errors_fatal 27844 1726882780.31752: done checking for any_errors_fatal 27844 1726882780.31753: checking for max_fail_percentage 27844 1726882780.31755: done checking for max_fail_percentage 27844 1726882780.31756: checking to see if all hosts have failed and the running result is not ok 27844 1726882780.31757: done checking to see if all hosts have failed 27844 1726882780.31758: getting the remaining hosts for this loop 27844 1726882780.31760: done getting the remaining hosts for this loop 27844 1726882780.31780: getting the next task for host managed_node1 27844 1726882780.31788: done getting next task for host managed_node1 27844 1726882780.31792: ^ task is: TASK: Verify DNS and network connectivity 27844 1726882780.31796: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27844 1726882780.31800: getting variables 27844 1726882780.31803: in VariableManager get_vars() 27844 1726882780.31846: Calling all_inventory to load vars for managed_node1 27844 1726882780.31849: Calling groups_inventory to load vars for managed_node1 27844 1726882780.31852: Calling all_plugins_inventory to load vars for managed_node1 27844 1726882780.31867: Calling all_plugins_play to load vars for managed_node1 27844 1726882780.31870: Calling groups_plugins_inventory to load vars for managed_node1 27844 1726882780.31874: Calling groups_plugins_play to load vars for managed_node1 27844 1726882780.34790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27844 1726882780.38182: done with get_vars() 27844 1726882780.38316: done getting variables 27844 1726882780.38485: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:39:40 -0400 (0:00:00.539) 0:00:39.462 ****** 27844 1726882780.38561: entering _queue_task() for managed_node1/shell 27844 1726882780.39008: worker is 1 (out of 1 available) 27844 1726882780.39029: exiting _queue_task() for managed_node1/shell 27844 1726882780.39042: done queuing things up, now waiting for results queue to drain 27844 1726882780.39044: waiting for pending results... 27844 1726882780.39360: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 27844 1726882780.39601: in run() - task 0e448fcc-3ce9-efa9-466a-000000000bb7 27844 1726882780.39637: variable 'ansible_search_path' from source: unknown 27844 1726882780.39642: variable 'ansible_search_path' from source: unknown 27844 1726882780.39683: calling self._execute() 27844 1726882780.39786: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882780.39797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882780.39812: variable 'omit' from source: magic vars 27844 1726882780.40218: variable 'ansible_distribution_major_version' from source: facts 27844 1726882780.40242: Evaluated conditional (ansible_distribution_major_version != '6'): True 27844 1726882780.40481: variable 'ansible_facts' from source: unknown 27844 1726882780.41404: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 27844 1726882780.41410: variable 'omit' from source: magic vars 27844 1726882780.41488: variable 'omit' from source: magic vars 27844 1726882780.41525: variable 'omit' from source: magic vars 27844 1726882780.41590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27844 1726882780.41625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27844 
1726882780.41662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27844 1726882780.41686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882780.41700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27844 1726882780.41728: variable 'inventory_hostname' from source: host vars for 'managed_node1' 27844 1726882780.41731: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882780.41734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882780.41877: Set connection var ansible_shell_type to sh 27844 1726882780.41880: Set connection var ansible_connection to ssh 27844 1726882780.41888: Set connection var ansible_pipelining to False 27844 1726882780.41894: Set connection var ansible_timeout to 10 27844 1726882780.41907: Set connection var ansible_shell_executable to /bin/sh 27844 1726882780.41914: Set connection var ansible_module_compression to ZIP_DEFLATED 27844 1726882780.41943: variable 'ansible_shell_executable' from source: unknown 27844 1726882780.41946: variable 'ansible_connection' from source: unknown 27844 1726882780.41949: variable 'ansible_module_compression' from source: unknown 27844 1726882780.41951: variable 'ansible_shell_type' from source: unknown 27844 1726882780.41953: variable 'ansible_shell_executable' from source: unknown 27844 1726882780.41956: variable 'ansible_host' from source: host vars for 'managed_node1' 27844 1726882780.41958: variable 'ansible_pipelining' from source: unknown 27844 1726882780.41960: variable 'ansible_timeout' from source: unknown 27844 1726882780.41969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 27844 1726882780.42135: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882780.42146: variable 'omit' from source: magic vars 27844 1726882780.42151: starting attempt loop 27844 1726882780.42154: running the handler 27844 1726882780.42170: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 27844 1726882780.42186: _low_level_execute_command(): starting 27844 1726882780.42207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27844 1726882780.43298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882780.43314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.43325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.43340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.43385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.43396: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882780.43406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.43425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.43433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.43439: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 27844 1726882780.43447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.43458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.43477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.43485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.43492: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.43506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.43589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.43610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.43756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.43981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.45585: stdout chunk (state=3): >>>/root <<< 27844 1726882780.45746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882780.45749: stdout chunk (state=3): >>><<< 27844 1726882780.45767: stderr chunk (state=3): >>><<< 27844 1726882780.45793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882780.45807: _low_level_execute_command(): starting 27844 1726882780.45813: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724 `" && echo ansible-tmp-1726882780.4579258-29641-99590158694724="` echo /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724 `" ) && sleep 0' 27844 1726882780.46478: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882780.46486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.46496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.46509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.46548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.46563: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882780.46577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.46589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.46596: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.46603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882780.46611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.46621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.46634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.46641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.46648: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.46658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.46737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.46754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.46769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.46902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.48745: stdout chunk (state=3): >>>ansible-tmp-1726882780.4579258-29641-99590158694724=/root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724 <<< 27844 1726882780.48923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882780.48926: stdout chunk (state=3): >>><<< 27844 1726882780.48933: stderr chunk (state=3): >>><<< 27844 1726882780.48950: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882780.4579258-29641-99590158694724=/root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882780.48988: variable 'ansible_module_compression' from source: unknown 27844 1726882780.49041: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-27844kglw_zwt/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27844 1726882780.49074: variable 'ansible_facts' from source: unknown 27844 1726882780.49164: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724/AnsiballZ_command.py 27844 1726882780.49915: Sending initial data 27844 1726882780.49918: Sent initial data (155 bytes) 27844 1726882780.52660: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882780.52671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.52681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.52695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.52731: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.52739: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882780.52753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.52775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.52782: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.52789: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882780.52797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.52806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.52817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.52825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.52831: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.52840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.52933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.53093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.53104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.53221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.55011: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27844 1726882780.55102: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 27844 1726882780.55197: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-27844kglw_zwt/tmpsq47h022 /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724/AnsiballZ_command.py <<< 27844 1726882780.55289: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 27844 1726882780.56724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882780.56962: stderr chunk (state=3): >>><<< 27844 1726882780.56970: stdout chunk (state=3): >>><<< 27844 1726882780.56972: done transferring module to remote 27844 1726882780.56975: _low_level_execute_command(): starting 27844 1726882780.56977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724/ /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724/AnsiballZ_command.py && sleep 0' 27844 1726882780.58416: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882780.58585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.58599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.58616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.58659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 27844 1726882780.58678: stderr chunk (state=3): >>>debug2: match not found <<< 27844 1726882780.58692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.58708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.58719: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.58728: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882780.58739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.58752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.58771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.58784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.58796: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.58809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.58888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.58904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.59580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.59707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.61561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27844 1726882780.61569: stdout chunk (state=3): >>><<< 27844 1726882780.61571: stderr chunk (state=3): >>><<< 27844 1726882780.61661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27844 1726882780.61669: _low_level_execute_command(): starting 27844 1726882780.61672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724/AnsiballZ_command.py && sleep 0' 27844 1726882780.63219: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27844 1726882780.63232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.63245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.63261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.63307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.63319: stderr chunk (state=3): >>>debug2: match not found <<< 27844 
1726882780.63332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.63348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27844 1726882780.63358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 27844 1726882780.63373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27844 1726882780.63385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27844 1726882780.63397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27844 1726882780.63411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27844 1726882780.63421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 27844 1726882780.63433: stderr chunk (state=3): >>>debug2: match found <<< 27844 1726882780.63445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27844 1726882780.63530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 27844 1726882780.63686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27844 1726882780.63700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27844 1726882780.64398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27844 1726882780.96347: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6100 0 --:--:-- --:--:-- --:--:-- 6224\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2508 0 --:--:-- --:--:-- --:--:-- 2508", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:39:40.772650", "end": "2024-09-20 21:39:40.961601", "delta": "0:00:00.188951", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27844 1726882780.97698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 27844 1726882780.97702: stdout chunk (state=3): >>><<< 27844 1726882780.97704: stderr chunk (state=3): >>><<< 27844 1726882780.97874: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total 
% Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6100 0 --:--:-- --:--:-- --:--:-- 6224\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2508 0 --:--:-- --:--:-- --:--:-- 2508", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:39:40.772650", "end": "2024-09-20 21:39:40.961601", "delta": "0:00:00.188951", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed.
27844 1726882780.97879: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
27844 1726882780.97881: _low_level_execute_command(): starting
27844 1726882780.97884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882780.4579258-29641-99590158694724/ > /dev/null 2>&1 && sleep 0'
27844 1726882780.98573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
27844 1726882780.98594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
27844 1726882780.98642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882780.98646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
27844 1726882780.98661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
27844 1726882780.98720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
27844 1726882780.98727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
27844 1726882780.98729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
27844 1726882780.98833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
27844 1726882781.00736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
27844 1726882781.00740: stdout chunk (state=3): >>><<<
27844 1726882781.00742: stderr chunk (state=3): >>><<<
27844 1726882781.01177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
27844 1726882781.01184: handler run complete
27844 1726882781.01187: Evaluated conditional (False): False
27844 1726882781.01189: attempt loop complete, returning result
27844 1726882781.01191: _execute() done
27844 1726882781.01193: dumping result to json
27844 1726882781.01195: done dumping result, returning
27844 1726882781.01196: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-efa9-466a-000000000bb7]
27844 1726882781.01199: sending task result for task 0e448fcc-3ce9-efa9-466a-000000000bb7
27844 1726882781.01276: done sending task result for task 0e448fcc-3ce9-efa9-466a-000000000bb7
27844 1726882781.01280: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.188951",
    "end": "2024-09-20 21:39:40.961601",
    "rc": 0,
    "start": "2024-09-20 21:39:40.772650"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   6100      0 --:--:-- --:--:-- --:--:--  6224
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   2508      0 --:--:-- --:--:-- --:--:--  2508

27844 1726882781.01397: no more pending results, returning what we have
27844 1726882781.01401: results queue empty
27844 1726882781.01402: 
checking for any_errors_fatal
27844 1726882781.01410: done checking for any_errors_fatal
27844 1726882781.01410: checking for max_fail_percentage
27844 1726882781.01412: done checking for max_fail_percentage
27844 1726882781.01413: checking to see if all hosts have failed and the running result is not ok
27844 1726882781.01414: done checking to see if all hosts have failed
27844 1726882781.01415: getting the remaining hosts for this loop
27844 1726882781.01416: done getting the remaining hosts for this loop
27844 1726882781.01419: getting the next task for host managed_node1
27844 1726882781.01429: done getting next task for host managed_node1
27844 1726882781.01431: ^ task is: TASK: meta (flush_handlers)
27844 1726882781.01433: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882781.01437: getting variables
27844 1726882781.01439: in VariableManager get_vars()
27844 1726882781.01481: Calling all_inventory to load vars for managed_node1
27844 1726882781.01484: Calling groups_inventory to load vars for managed_node1
27844 1726882781.01486: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882781.01497: Calling all_plugins_play to load vars for managed_node1
27844 1726882781.01499: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882781.01502: Calling groups_plugins_play to load vars for managed_node1
27844 1726882781.04653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882781.07631: done with get_vars()
27844 1726882781.07659: done getting variables
27844 1726882781.07747: in VariableManager get_vars()
27844 1726882781.07768: Calling all_inventory to load vars for managed_node1
27844 1726882781.07771: Calling groups_inventory to load vars for managed_node1
27844 1726882781.07774: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882781.07779: Calling all_plugins_play to load vars for managed_node1
27844 1726882781.07781: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882781.07789: Calling groups_plugins_play to load vars for managed_node1
27844 1726882781.15561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882781.17969: done with get_vars()
27844 1726882781.18010: done queuing things up, now waiting for results queue to drain
27844 1726882781.18012: results queue empty
27844 1726882781.18013: checking for any_errors_fatal
27844 1726882781.18018: done checking for any_errors_fatal
27844 1726882781.18018: checking for max_fail_percentage
27844 1726882781.18019: done checking for max_fail_percentage
27844 1726882781.18020: checking to see if all hosts have failed and the running result is not ok
27844 1726882781.18021: done checking to see if all hosts have failed
27844 1726882781.18023: getting the remaining hosts for this loop
27844 1726882781.18024: done getting the remaining hosts for this loop
27844 1726882781.18027: getting the next task for host managed_node1
27844 1726882781.18032: done getting next task for host managed_node1
27844 1726882781.18034: ^ task is: TASK: meta (flush_handlers)
27844 1726882781.18036: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882781.18039: getting variables
27844 1726882781.18040: in VariableManager get_vars()
27844 1726882781.18055: Calling all_inventory to load vars for managed_node1
27844 1726882781.18057: Calling groups_inventory to load vars for managed_node1
27844 1726882781.18058: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882781.18063: Calling all_plugins_play to load vars for managed_node1
27844 1726882781.18066: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882781.18069: Calling groups_plugins_play to load vars for managed_node1
27844 1726882781.18924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882781.20006: done with get_vars()
27844 1726882781.20023: done getting variables
27844 1726882781.20060: in VariableManager get_vars()
27844 1726882781.20075: Calling all_inventory to load vars for managed_node1
27844 1726882781.20077: Calling groups_inventory to load vars for managed_node1
27844 1726882781.20079: Calling all_plugins_inventory to load vars for managed_node1
27844 1726882781.20083: Calling all_plugins_play to load vars for managed_node1
27844 1726882781.20089: Calling groups_plugins_inventory to load vars for managed_node1
27844 1726882781.20091: Calling groups_plugins_play to load vars for managed_node1
27844 1726882781.20797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27844 1726882781.21721: done with get_vars()
27844 1726882781.21740: done queuing things up, now waiting for results queue to drain
27844 1726882781.21742: results queue empty
27844 1726882781.21742: checking for any_errors_fatal
27844 1726882781.21743: done checking for any_errors_fatal
27844 1726882781.21744: checking for max_fail_percentage
27844 1726882781.21744: done checking for max_fail_percentage
27844 1726882781.21745: checking to see if all hosts have failed and the running result is not ok
27844 1726882781.21745: done checking to see if all hosts have failed
27844 1726882781.21746: getting the remaining hosts for this loop
27844 1726882781.21747: done getting the remaining hosts for this loop
27844 1726882781.21749: getting the next task for host managed_node1
27844 1726882781.21751: done getting next task for host managed_node1
27844 1726882781.21751: ^ task is: None
27844 1726882781.21753: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27844 1726882781.21753: done queuing things up, now waiting for results queue to drain
27844 1726882781.21754: results queue empty
27844 1726882781.21754: checking for any_errors_fatal
27844 1726882781.21755: done checking for any_errors_fatal
27844 1726882781.21755: checking for max_fail_percentage
27844 1726882781.21756: done checking for max_fail_percentage
27844 1726882781.21756: checking to see if all hosts have failed and the running result is not ok
27844 1726882781.21756: done checking to see if all hosts have failed
27844 1726882781.21758: getting the next task for host managed_node1
27844 1726882781.21759: done getting next task for host managed_node1
27844 1726882781.21760: ^ task is: None
27844 1726882781.21761: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1              : ok=107  changed=3    unreachable=0    failed=0    skipped=88   rescued=0    ignored=2

Friday 20 September 2024  21:39:41 -0400 (0:00:00.832)       0:00:40.294 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.73s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.57s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.47s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.43s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Install iproute --------------------------------------------------------- 1.39s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Create veth interface ethtest0 ------------------------------------------ 1.16s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Create veth interface ethtest1 ------------------------------------------ 1.07s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.83s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3
Verify DNS and network connectivity ------------------------------------- 0.83s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.78s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.76s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather current interface info ------------------------------------------- 0.71s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.71s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.65s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.60s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
27844 1726882781.21839: RUNNING CLEANUP
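The "Verify DNS and network connectivity" task above runs a small shell loop on the managed node: for each mirror host it checks name resolution with `getent hosts` and HTTPS reachability with `curl`, failing fast on the first host that misses either check. A standalone sketch of the same check is below; the hostnames and getent/curl structure come from the logged command, while the script name, the `check_host` helper, and the `-sSf`/`--max-time` curl flags are assumptions added for quiet, bounded runs.

```shell
#!/bin/sh
# check_mirrors.sh (hypothetical name): sketch of the connectivity check
# embedded in the task above.

check_host() {
    host="$1"
    # DNS lookup: getent consults the system resolver (NSS), as the task does.
    if ! getent hosts "$host" > /dev/null; then
        echo "FAILED to lookup host $host" >&2
        return 1
    fi
    # HTTPS reachability: -f turns HTTP errors into a non-zero exit status,
    # -sS silences the progress table seen in the task's STDERR.
    if ! curl -sSf --max-time 10 -o /dev/null "https://$host"; then
        echo "FAILED to contact host $host" >&2
        return 1
    fi
    echo "OK $host"
}

# Check every host given on the command line; with no arguments the loop
# body never runs, so the script is safe to execute without network access.
for host in "$@"; do
    check_host "$host" || exit 1
done
```

Run as, e.g., `sh check_mirrors.sh mirrors.fedoraproject.org mirrors.centos.org` to mirror the task's behavior; exit status 1 plus a `FAILED ...` line indicates which check broke first.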