Formatting '/cache/rhel-8-y.qcow2.snap', fmt=qcow2 cluster_size=65536 extended_l2=off compression_type=zlib size=10737418240 backing_file=/cache/rhel-8-y.qcow2 backing_fmt=qcow2 lazy_refcounts=off refcount_bits=16
ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpwnjj5ib_
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: rhel-8-y_setup.yml ***************************************************
2 plays in /cache/rhel-8-y_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-8-y_setup.yml:5
Wednesday 27 July 2022 21:14:12 +0000 (0:00:00.017) 0:00:00.017 ********
changed: [/cache/rhel-8-y.qcow2.snap] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
changed: [/cache/rhel-8-y.qcow2.snap] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
changed: [/cache/rhel-8-y.qcow2.snap] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
changed: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
META: ran handlers
META: ran handlers

PLAY [Set up host for test playbooks] ******************************************

TASK [Gathering Facts] *********************************************************
task path: /cache/rhel-8-y_setup.yml:19
Wednesday 27 July 2022 21:14:13 +0000 (0:00:01.383) 0:00:01.401 ********
ok: [/cache/rhel-8-y.qcow2.snap]
META: ran handlers

TASK [Create EPEL 8 repo] ******************************************************
task path: /cache/rhel-8-y_setup.yml:23
Wednesday 27 July 2022 21:14:14 +0000 (0:00:01.061) 0:00:02.462 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }

TASK [Create yum cache] ********************************************************
task path: /cache/rhel-8-y_setup.yml:33
Wednesday 27 July 2022 21:14:21 +0000 (0:00:06.440) 0:00:08.902 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [Create dnf cache] ********************************************************
task path: /cache/rhel-8-y_setup.yml:39
Wednesday 27 July 2022 21:14:21 +0000 (0:00:00.018) 0:00:08.920 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }

TASK [Disable EPEL 7] **********************************************************
task path: /cache/rhel-8-y_setup.yml:45
Wednesday 27 July 2022 21:14:30 +0000 (0:00:09.395) 0:00:18.316 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [Disable EPEL 8] **********************************************************
task path: /cache/rhel-8-y_setup.yml:53
Wednesday 27 July 2022 21:14:30 +0000 (0:00:00.023) 0:00:18.339 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-8-y.qcow2.snap : ok=5 changed=4 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0

Wednesday 27 July 2022 21:14:31 +0000 (0:00:00.814) 0:00:19.154 ********
===============================================================================
Create dnf cache -------------------------------------------------------- 9.40s
/cache/rhel-8-y_setup.yml:39 --------------------------------------------------
Create EPEL 8 repo ------------------------------------------------------ 6.44s
/cache/rhel-8-y_setup.yml:23 --------------------------------------------------
set up internal repositories -------------------------------------------- 1.38s
/cache/rhel-8-y_setup.yml:5 ---------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.06s
/cache/rhel-8-y_setup.yml:19 --------------------------------------------------
Disable EPEL 8 ---------------------------------------------------------- 0.81s
/cache/rhel-8-y_setup.yml:53 --------------------------------------------------
Disable EPEL 7 ---------------------------------------------------------- 0.02s
/cache/rhel-8-y_setup.yml:45 --------------------------------------------------
Create yum cache -------------------------------------------------------- 0.02s
/cache/rhel-8-y_setup.yml:33 --------------------------------------------------

PLAYBOOK: setup-snapshot.yml ***************************************************
1 plays in /tmp/tmp1l16wi83/tests/setup-snapshot.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp1l16wi83/tests/setup-snapshot.yml:2
Wednesday 27 July 2022 21:14:31 +0000 (0:00:00.011) 0:00:19.166 ********
ok: [/cache/rhel-8-y.qcow2.snap]
META: ran handlers

TASK [Set platform/version specific variables] *********************************
task path: /tmp/tmp1l16wi83/tests/setup-snapshot.yml:4
Wednesday 27 July 2022 21:14:32 +0000 (0:00:00.823) 0:00:19.989 ********

TASK [linux-system-roles.podman : Ensure ansible_facts used by role] ***********
task path: /tmp/tmp1l16wi83/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:3
Wednesday 27 July 2022 21:14:32 +0000 (0:00:00.030) 0:00:20.020 ********
ok: [/cache/rhel-8-y.qcow2.snap]

TASK [linux-system-roles.podman : Set platform/version specific variables] *****
task path: /tmp/tmp1l16wi83/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:9
Wednesday 27 July 2022 21:14:32 +0000 (0:00:00.510) 0:00:20.530 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat_8.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_8.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat_8.7.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_8.7.yml", "skip_reason": "Conditional result was False" }
META: role_complete for /cache/rhel-8-y.qcow2.snap

TASK [Install test packages] ***************************************************
task path: /tmp/tmp1l16wi83/tests/setup-snapshot.yml:10
Wednesday 27 July 2022 21:14:32 +0000 (0:00:00.047) 0:00:20.578 ********
changed: [/cache/rhel-8-y.qcow2.snap] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: criu-3.15-3.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: iptables-1.8.4-22.el8.x86_64",
        "Installed: iptables-libs-1.8.4-22.el8.x86_64",
        "Installed: slirp4netns-1.2.0-2.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: runc-1:1.1.3-2.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: libnetfilter_conntrack-1.0.6-5.el8.x86_64",
        "Installed: libnfnetlink-1.0.1-13.el8.x86_64",
        "Installed: libnftnl-1.1.5-5.el8.x86_64",
        "Installed: libnet-1.1.6-15.el8.x86_64",
        "Installed: fuse-common-3.3.0-16.el8.x86_64",
        "Installed: protobuf-c-1.3.0-6.el8.x86_64",
        "Installed: fuse3-3.3.0-16.el8.x86_64",
        "Installed: shadow-utils-subid-2:4.6-16.el8.x86_64",
        "Installed: fuse3-libs-3.3.0-16.el8.x86_64",
        "Installed: nftables-1:0.9.3-26.el8.x86_64",
        "Installed: podman-2:4.1.1-6.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: conmon-2:2.1.2-2.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: fuse-overlayfs-1.9-1.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: podman-catatonit-2:4.1.1-6.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: container-selinux-2:2.188.0-1.module+el8.7.0+15895+a6753917.noarch",
        "Installed: containernetworking-plugins-1:1.1.1-3.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: containers-common-2:1-34.module+el8.7.0+15895+a6753917.x86_64",
        "Installed: libslirp-4.4.0-1.module+el8.7.0+15895+a6753917.x86_64"
    ]
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-8-y.qcow2.snap : ok=8 changed=5 unreachable=0 failed=0 skipped=3 rescued=0 ignored=0

Wednesday 27 July 2022 21:15:10 +0000 (0:00:37.993) 0:00:58.572 ********
===============================================================================
Install test packages -------------------------------------------------- 37.99s
/tmp/tmp1l16wi83/tests/setup-snapshot.yml:10 ----------------------------------
Create dnf cache -------------------------------------------------------- 9.40s
/cache/rhel-8-y_setup.yml:39 --------------------------------------------------
Create EPEL 8 repo ------------------------------------------------------ 6.44s
/cache/rhel-8-y_setup.yml:23 --------------------------------------------------
set up internal repositories -------------------------------------------- 1.38s
/cache/rhel-8-y_setup.yml:5 ---------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.06s
/cache/rhel-8-y_setup.yml:19 --------------------------------------------------
Gathering Facts --------------------------------------------------------- 0.82s
/tmp/tmp1l16wi83/tests/setup-snapshot.yml:2 -----------------------------------
Disable EPEL 8 ---------------------------------------------------------- 0.81s
/cache/rhel-8-y_setup.yml:53 --------------------------------------------------
linux-system-roles.podman : Ensure ansible_facts used by role ----------- 0.51s
/tmp/tmp1l16wi83/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:3 ---
linux-system-roles.podman : Set platform/version specific variables ----- 0.05s
/tmp/tmp1l16wi83/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:9 ---
Set platform/version specific variables --------------------------------- 0.03s
/tmp/tmp1l16wi83/tests/setup-snapshot.yml:4 -----------------------------------
Disable EPEL 7 ---------------------------------------------------------- 0.02s
/cache/rhel-8-y_setup.yml:45 --------------------------------------------------
Create yum cache -------------------------------------------------------- 0.02s
/cache/rhel-8-y_setup.yml:33 --------------------------------------------------

PLAYBOOK: rhel-8-y_post_setup.yml **********************************************
1 plays in /cache/rhel-8-y_post_setup.yml

PLAY [Post setup - these happen last] ******************************************
META: ran handlers

TASK [force sync of filesystems - ensure setup changes are made to snapshot] ***
task path: /cache/rhel-8-y_post_setup.yml:5
Wednesday 27 July 2022 21:15:10 +0000 (0:00:00.018) 0:00:58.590 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }

TASK [shutdown guest] **********************************************************
task path: /cache/rhel-8-y_post_setup.yml:8
Wednesday 27 July 2022 21:15:11 +0000 (0:00:00.354) 0:00:58.944 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-8-y.qcow2.snap : ok=10 changed=7 unreachable=0 failed=0 skipped=3 rescued=0 ignored=0

Wednesday 27 July 2022 21:15:11 +0000 (0:00:00.505) 0:00:59.450 ********
===============================================================================
Install test packages -------------------------------------------------- 37.99s
/tmp/tmp1l16wi83/tests/setup-snapshot.yml:10 ----------------------------------
Create dnf cache -------------------------------------------------------- 9.40s
/cache/rhel-8-y_setup.yml:39 --------------------------------------------------
Create EPEL 8 repo ------------------------------------------------------ 6.44s
/cache/rhel-8-y_setup.yml:23 --------------------------------------------------
set up internal repositories -------------------------------------------- 1.38s
/cache/rhel-8-y_setup.yml:5 ---------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.06s
/cache/rhel-8-y_setup.yml:19 --------------------------------------------------
Gathering Facts --------------------------------------------------------- 0.82s
/tmp/tmp1l16wi83/tests/setup-snapshot.yml:2 -----------------------------------
Disable EPEL 8 ---------------------------------------------------------- 0.81s
/cache/rhel-8-y_setup.yml:53 --------------------------------------------------
linux-system-roles.podman : Ensure ansible_facts used by role ----------- 0.51s
/tmp/tmp1l16wi83/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:3 ---
shutdown guest ---------------------------------------------------------- 0.51s
/cache/rhel-8-y_post_setup.yml:8 ----------------------------------------------
force sync of filesystems - ensure setup changes are made to snapshot --- 0.35s
/cache/rhel-8-y_post_setup.yml:5 ----------------------------------------------
linux-system-roles.podman : Set platform/version specific variables ----- 0.05s
/tmp/tmp1l16wi83/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:9 ---
Set platform/version specific variables --------------------------------- 0.03s
/tmp/tmp1l16wi83/tests/setup-snapshot.yml:4 -----------------------------------
Disable EPEL 7 ---------------------------------------------------------- 0.02s
/cache/rhel-8-y_setup.yml:45 --------------------------------------------------
Create yum cache -------------------------------------------------------- 0.02s
/cache/rhel-8-y_setup.yml:33 --------------------------------------------------

ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpwnjj5ib_
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_basic.yml ******************************************************
1 plays in /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml

PLAY [Ensure that the role runs with default parameters] ***********************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:3
Wednesday 27 July 2022 21:15:55 +0000 (0:00:00.014) 0:00:00.014 ********
ok: [/cache/rhel-8-y.qcow2.snap]
META: ran handlers

TASK [Enable podman copr] ******************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:136
Wednesday 27 July 2022 21:15:56 +0000 (0:00:01.195) 0:00:01.210 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "cmd": [ "dnf", "copr", "enable", "rhcontainerbot/podman-next", "-y" ], "delta": "0:00:00.679158", "end": "2022-07-27 17:15:57.242382", "rc": 0, "start": "2022-07-27 17:15:56.563224" }

STDOUT:

Repository successfully enabled.

STDERR:

Enabling a Copr repository. Please note that this repository is not part of the main distribution, and quality may vary. The Fedora Project does not exercise any power over the contents of this repository beyond the rules outlined in the Copr FAQ at , and packages are not held to any quality or security level. Please do not file bug reports about these packages in Fedora Bugzilla. In case of problems, contact the owner of this repository.

TASK [Install podman from updates-testing] *************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:139
Wednesday 27 July 2022 21:15:58 +0000 (0:00:01.124) 0:00:02.335 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "cmd": [ "dnf", "-y", "install", "podman" ], "delta": "0:00:01.733275", "end": "2022-07-27 17:15:59.326145", "rc": 0, "start": "2022-07-27 17:15:57.592870" }

STDOUT:

Copr repo for podman-next owned by rhcontainerb 8.6 MB/s | 1.8 MB 00:00
Package podman-2:4.1.1-6.module+el8.7.0+15895+a6753917.x86_64 is already installed.
Dependencies resolved.
Nothing to do.
Complete!

TASK [Podman version] **********************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:142
Wednesday 27 July 2022 21:16:00 +0000 (0:00:02.092) 0:00:04.427 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "cmd": [ "podman", "--version" ], "delta": "0:00:00.061174", "end": "2022-07-27 17:15:59.748345", "rc": 0, "start": "2022-07-27 17:15:59.687171" }

STDOUT:

podman version 4.1.1

TASK [Create user] *************************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:145
Wednesday 27 July 2022 21:16:00 +0000 (0:00:00.410) 0:00:04.838 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "comment": "", "create_home": true, "group": 1001, "home": "/home/user1", "name": "user1", "shell": "/bin/bash", "state": "present", "system": false, "uid": 1001 }

TASK [Create tempfile for kube_src] ********************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:149
Wednesday 27 July 2022 21:16:01 +0000 (0:00:00.780) 0:00:05.619 ********
changed: [/cache/rhel-8-y.qcow2.snap -> localhost] => { "changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_podman_k4vfy_7n.yml", "size": 0, "state": "file", "uid": 0 }

TASK [Write kube_file_src] *****************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:157
Wednesday 27 July 2022 21:16:01 +0000 (0:00:00.317) 0:00:05.937 ********
changed: [/cache/rhel-8-y.qcow2.snap -> localhost] => { "changed": true, "checksum": "7c999d33fe2b60b3c65ec0a85b8924cc4e970d83", "dest": "/tmp/lsr_podman_k4vfy_7n.yml", "gid": 0, "group": "root", "md5sum": "4807f59df21780236988d1eb89142aa2", "mode": "0600", "owner": "root", "size": 665, "src": "/root/.ansible/tmp/ansible-tmp-1658956561.651799-34222-108758298658742/source", "state": "file", "uid": 0 }

TASK [Create host directories for data] ****************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:163
Wednesday 27 July 2022 21:16:02 +0000 (0:00:00.588) 0:00:06.525 ********
changed: [/cache/rhel-8-y.qcow2.snap] => (item=['httpd1', 'user1', 1001]) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": [ "httpd1", "user1", 1001 ], "mode": "0755", "owner": "user1", "path": "/tmp/httpd1", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 1001 }
changed: [/cache/rhel-8-y.qcow2.snap] => (item=['httpd2', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": [ "httpd2", "root", 0 ], "mode": "0755", "owner": "root", "path": "/tmp/httpd2", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0 }
changed: [/cache/rhel-8-y.qcow2.snap] => (item=['httpd3', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": [ "httpd3", "root", 0 ], "mode": "0755", "owner": "root", "path": "/tmp/httpd3", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0 }

TASK [Create data files] *******************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:171
Wednesday 27 July 2022 21:16:03 +0000 (0:00:01.199) 0:00:07.724 ********
changed: [/cache/rhel-8-y.qcow2.snap] => (item=['httpd1', 'user1', 1001]) => { "ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd1/index.txt", "gid": 0, "group": "root", "item": [ "httpd1", "user1", 1001 ], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "user1", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956563.465736-34287-180223810980796/source", "state": "file", "uid": 1001 }
changed: [/cache/rhel-8-y.qcow2.snap] => (item=['httpd2', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd2/index.txt", "gid": 0, "group": "root", "item": [ "httpd2", "root", 0 ], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956564.099485-34287-201298365596915/source", "state": "file", "uid": 0 }
changed: [/cache/rhel-8-y.qcow2.snap] => (item=['httpd3', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd3/index.txt", "gid": 0, "group": "root", "item": [ "httpd3", "root", 0 ], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956564.7314875-34287-264587028996914/source", "state": "file", "uid": 0 }

TASK [Run role] ****************************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:179
Wednesday 27 July 2022 21:16:05 +0000 (0:00:00.038) 0:00:09.664 ********

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Wednesday 27 July 2022 21:16:05 +0000 (0:00:00.038) 0:00:09.703 ********
included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for /cache/rhel-8-y.qcow2.snap

TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Wednesday 27 July 2022 21:16:05 +0000 (0:00:00.030) 0:00:09.733 ********
ok: [/cache/rhel-8-y.qcow2.snap]

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:8
Wednesday 27 July 2022 21:16:05 +0000 (0:00:00.511) 0:00:10.244 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat_8.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_8.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8-y.qcow2.snap] => (item=RedHat_8.7.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_8.7.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Wednesday 27 July 2022 21:16:05 +0000 (0:00:00.044) 0:00:10.289 ********
ok: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:11
Wednesday 27 July 2022 21:16:07 +0000 (0:00:01.528) 0:00:11.817 ********
included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for /cache/rhel-8-y.qcow2.snap

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists - system] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:3
Wednesday 27 July 2022 21:16:07 +0000 (0:00:00.034) 0:00:11.851 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/containers.conf.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists - user] ****
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:12
Wednesday 27 July 2022 21:16:07 +0000 (0:00:00.374) 0:00:12.226 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update system container config file] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:21
Wednesday 27 July 2022 21:16:07 +0000 (0:00:00.024) 0:00:12.250 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "checksum": "c56b033cc4634627f8794d6ccf07a051e0820c07", "dest": "/etc/containers/containers.conf.d/50-systemroles.conf", "gid": 0, "group": "root", "md5sum": "f25228df7b38eaff9b4f63b1b39baa1c", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 94, "src": "/root/.ansible/tmp/ansible-tmp-1658956567.9974167-34426-7742901335802/source", "state": "file", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Update non-root user container config file] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:32
Wednesday 27 July 2022 21:16:08 +0000 (0:00:00.675) 0:00:12.926 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Wednesday 27 July 2022 21:16:08 +0000 (0:00:00.023) 0:00:12.949 ********
included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for /cache/rhel-8-y.qcow2.snap

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists - system] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:3
Wednesday 27 July 2022 21:16:08 +0000 (0:00:00.033) 0:00:12.983 ********
ok: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/registries.conf.d", "secontext": "system_u:object_r:etc_t:s0", "size": 107, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists - user] ****
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:12
Wednesday 27 July 2022 21:16:09 +0000 (0:00:00.365) 0:00:13.349 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update system registries config file] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:21
Wednesday 27 July 2022 21:16:09 +0000 (0:00:00.023) 0:00:13.372 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "checksum": "15062ec12da5e642a2b0fb64c5e03d43b80d9cf0", "dest": "/etc/containers/registries.conf.d/50-systemroles.conf", "gid": 0, "group": "root", "md5sum": "88be21c8634b01869b9f694831b84c1d", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 22, "src": "/root/.ansible/tmp/ansible-tmp-1658956569.1224136-34476-129272316490999/source", "state": "file", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Update non-root user registries config file] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:32
Wednesday 27 July 2022 21:16:09 +0000 (0:00:00.722) 0:00:14.095 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:17
Wednesday 27 July 2022 21:16:09 +0000 (0:00:00.023) 0:00:14.118 ********

TASK [fedora.linux_system_roles.firewall : include_tasks] **********************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:1
Wednesday 27 July 2022 21:16:09 +0000 (0:00:00.092) 0:00:14.210 ********
included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for /cache/rhel-8-y.qcow2.snap

TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2
Wednesday 27 July 2022 21:16:09 +0000 (0:00:00.030) 0:00:14.241 ********
ok: [/cache/rhel-8-y.qcow2.snap]

TASK [fedora.linux_system_roles.firewall : Install firewalld] ******************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:7
Wednesday 27 July 2022 21:16:10 +0000 (0:00:00.498) 0:00:14.739 ********
changed: [/cache/rhel-8-y.qcow2.snap] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: ipset-libs-7.1-1.el8.x86_64",
        "Installed: python3-firewall-0.9.3-13.el8.noarch",
        "Installed: iptables-ebtables-1.8.4-22.el8.x86_64",
        "Installed: firewalld-0.9.3-13.el8.noarch",
        "Installed: firewalld-filesystem-0.9.3-13.el8.noarch",
        "Installed: python3-slip-0.6.4-13.el8.noarch",
        "Installed: python3-slip-dbus-0.6.4-13.el8.noarch",
        "Installed: ipset-7.1-1.el8.x86_64",
        "Installed: python3-nftables-1:0.9.3-26.el8.x86_64"
    ]
}

TASK [fedora.linux_system_roles.firewall : Install python-firewall] ************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:12
Wednesday 27 July 2022 21:16:13 +0000 (0:00:02.601) 0:00:17.341 ********
skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Install python3-firewall] ***********
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:18
Wednesday 27 July 2022 21:16:13 +0000 (0:00:00.027) 0:00:17.368 ********
ok: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] ***
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:3
Wednesday 27 July 2022 21:16:14 +0000 (0:00:01.273) 0:00:18.642 ********
changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "enabled": true, "name": "firewalld", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dbus.service sysinit.target polkit.service basic.target dbus.socket system.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "multi-user.target shutdown.target network-pre.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "ebtables.service ipset.service ip6tables.service iptables.service shutdown.target nftables.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "man:firewalld(1)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "6963", "LimitNPROCSoft": "6963", "LimitRSS":
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6963", "LimitSIGPENDINGSoft": "6963", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "firewalld.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", 
"StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "11140", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9 Wednesday 27 July 2022 21:16:15 +0000 (0:00:01.004) 0:00:19.647 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__firewall_previous_replaced": false, "__firewall_python_cmd": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:14 Wednesday 27 July 2022 21:16:15 +0000 (0:00:00.055) 0:00:19.702 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Configure firewall] ***************** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:36 Wednesday 27 July 2022 21:16:15 +0000 (0:00:00.035) 0:00:19.738 
******** changed: [/cache/rhel-8-y.qcow2.snap] => (item={'port': '8080-8082/tcp', 'state': 'enabled'}) => { "__firewall_changed": true, "ansible_loop_var": "item", "changed": true, "item": { "port": "8080-8082/tcp", "state": "enabled" } } TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:66 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.729) 0:00:20.467 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Calculate what has changed] ********* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:74 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.033) 0:00:20.500 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Show diffs] ************************* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:80 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.031) 0:00:20.532 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => {} META: role_complete for /cache/rhel-8-y.qcow2.snap TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:24 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.044) 0:00:20.576 ******** redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.seboolean to ansible.posix.seboolean redirecting (type: modules) ansible.builtin.sefcontext to community.general.sefcontext redirecting (type: modules) community.general.sefcontext to 
community.general.system.sefcontext redirecting (type: modules) ansible.builtin.seport to community.general.seport redirecting (type: modules) community.general.seport to community.general.system.seport redirecting (type: modules) ansible.builtin.selogin to community.general.selogin redirecting (type: modules) community.general.selogin to community.general.system.selogin TASK [fedora.linux_system_roles.selinux : Set ansible_facts required by role and install packages] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:2 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.122) 0:00:20.699 ******** included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml for /cache/rhel-8-y.qcow2.snap TASK [fedora.linux_system_roles.selinux : Ensure ansible_facts used by role] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:2 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.036) 0:00:20.736 ******** ok: [/cache/rhel-8-y.qcow2.snap] TASK [fedora.linux_system_roles.selinux : Install SELinux python2 tools] ******* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:7 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.493) 0:00:21.229 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Install SELinux python3 tools] ******* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:15 Wednesday 27 July 2022 21:16:16 +0000 (0:00:00.024) 0:00:21.254 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.selinux : refresh facts] *********************** task path: 
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:23 Wednesday 27 July 2022 21:16:18 +0000 (0:00:01.246) 0:00:22.500 ******** ok: [/cache/rhel-8-y.qcow2.snap] TASK [fedora.linux_system_roles.selinux : Install SELinux tool semanage] ******* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:28 Wednesday 27 July 2022 21:16:18 +0000 (0:00:00.801) 0:00:23.302 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if enabled] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:5 Wednesday 27 July 2022 21:16:20 +0000 (0:00:01.282) 0:00:24.585 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if disabled] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:12 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.023) 0:00:24.608 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set selinux_reboot_required] ********* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:19 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.022) 0:00:24.631 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "selinux_reboot_required": false }, "changed": false } TASK [fedora.linux_system_roles.selinux : Fail if reboot is required] ********** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:23 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.047) 0:00:24.678 ******** skipping: 
[/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Warn if SELinux is disabled] ********* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:28 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.022) 0:00:24.700 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => {} TASK [fedora.linux_system_roles.selinux : Drop all local modifications] ******** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:33 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.021) 0:00:24.722 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux boolean local modifications] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:40 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.038) 0:00:24.761 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux file context local modifications] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:44 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.035) 0:00:24.796 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux port local modifications] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:48 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.033) 0:00:24.830 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux login local 
modifications] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:52 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.033) 0:00:24.864 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set SELinux booleans] **************** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:56 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.033) 0:00:24.897 ******** TASK [fedora.linux_system_roles.selinux : Set SELinux file contexts] *********** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:63 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.021) 0:00:24.918 ******** TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem tree] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:72 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.021) 0:00:24.939 ******** TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem tree in check mode] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:78 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.020) 0:00:24.960 ******** TASK [fedora.linux_system_roles.selinux : Set an SELinux label on a port] ****** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:86 Wednesday 27 July 2022 21:16:20 +0000 (0:00:00.021) 0:00:24.982 ******** redirecting (type: modules) ansible.builtin.seport to community.general.seport redirecting (type: modules) community.general.seport to community.general.system.seport changed: [/cache/rhel-8-y.qcow2.snap] => (item={'ports': '8080-8082', 'setype': 'http_port_t'}) => { "ansible_loop_var": "item", "changed": true, "item": { "ports": "8080-8082", "setype": 
"http_port_t" }, "ports": [ "8080-8082" ], "proto": "tcp", "setype": "http_port_t", "state": "present" } TASK [fedora.linux_system_roles.selinux : Set linux user to SELinux user mapping] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:94 Wednesday 27 July 2022 21:16:22 +0000 (0:00:02.287) 0:00:27.269 ******** TASK [fedora.linux_system_roles.selinux : Get SELinux modules facts] *********** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:104 Wednesday 27 July 2022 21:16:22 +0000 (0:00:00.022) 0:00:27.291 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "selinux_installed_modules": { "abrt": { "100": "enabled" }, "accountsd": { "100": "enabled" }, "acct": { "100": "enabled" }, "afs": { "100": "enabled" }, "aiccu": { "100": "enabled" }, "aide": { "100": "enabled" }, "ajaxterm": { "100": "enabled" }, "alsa": { "100": "enabled" }, "amanda": { "100": "enabled" }, "amtu": { "100": "enabled" }, "anaconda": { "100": "enabled" }, "antivirus": { "100": "enabled" }, "apache": { "100": "enabled" }, "apcupsd": { "100": "enabled" }, "apm": { "100": "enabled" }, "application": { "100": "enabled" }, "arpwatch": { "100": "enabled" }, "asterisk": { "100": "enabled" }, "auditadm": { "100": "enabled" }, "authconfig": { "100": "enabled" }, "authlogin": { "100": "enabled" }, "automount": { "100": "enabled" }, "avahi": { "100": "enabled" }, "awstats": { "100": "enabled" }, "bacula": { "100": "enabled" }, "base": { "100": "enabled" }, "bcfg2": { "100": "enabled" }, "bind": { "100": "enabled" }, "bitlbee": { "100": "enabled" }, "blkmapd": { "100": "enabled" }, "blueman": { "100": "enabled" }, "bluetooth": { "100": "enabled" }, "boinc": { "100": "enabled" }, "boltd": { "100": "enabled" }, "bootloader": { "100": "enabled" }, "brctl": { "100": "enabled" }, "brltty": { "100": "enabled" }, "bugzilla": { "100": "enabled" }, "bumblebee": { "100": "enabled" }, 
"cachefilesd": { "100": "enabled" }, "calamaris": { "100": "enabled" }, "callweaver": { "100": "enabled" }, "canna": { "100": "enabled" }, "ccs": { "100": "enabled" }, "cdrecord": { "100": "enabled" }, "certmaster": { "100": "enabled" }, "certmonger": { "100": "enabled" }, "certwatch": { "100": "enabled" }, "cfengine": { "100": "enabled" }, "cgdcbxd": { "100": "enabled" }, "cgroup": { "100": "enabled" }, "chrome": { "100": "enabled" }, "chronyd": { "100": "enabled" }, "cinder": { "100": "enabled" }, "cipe": { "100": "enabled" }, "clock": { "100": "enabled" }, "clogd": { "100": "enabled" }, "cloudform": { "100": "enabled" }, "cmirrord": { "100": "enabled" }, "cobbler": { "100": "enabled" }, "cockpit": { "100": "enabled", "200": "enabled" }, "collectd": { "100": "enabled" }, "colord": { "100": "enabled" }, "comsat": { "100": "enabled" }, "condor": { "100": "enabled" }, "conman": { "100": "enabled" }, "conntrackd": { "100": "enabled" }, "consolekit": { "100": "enabled" }, "container": { "200": "enabled" }, "couchdb": { "100": "enabled" }, "courier": { "100": "enabled" }, "cpucontrol": { "100": "enabled" }, "cpufreqselector": { "100": "enabled" }, "cpuplug": { "100": "enabled" }, "cron": { "100": "enabled" }, "ctdb": { "100": "enabled" }, "cups": { "100": "enabled" }, "cvs": { "100": "enabled" }, "cyphesis": { "100": "enabled" }, "cyrus": { "100": "enabled" }, "daemontools": { "100": "enabled" }, "dbadm": { "100": "enabled" }, "dbskk": { "100": "enabled" }, "dbus": { "100": "enabled" }, "dcc": { "100": "enabled" }, "ddclient": { "100": "enabled" }, "denyhosts": { "100": "enabled" }, "devicekit": { "100": "enabled" }, "dhcp": { "100": "enabled" }, "dictd": { "100": "enabled" }, "dirsrv": { "100": "enabled" }, "dirsrv-admin": { "100": "enabled" }, "dmesg": { "100": "enabled" }, "dmidecode": { "100": "enabled" }, "dnsmasq": { "100": "enabled" }, "dnssec": { "100": "enabled" }, "dovecot": { "100": "enabled" }, "drbd": { "100": "enabled" }, "dspam": { "100": "enabled" }, 
"entropyd": { "100": "enabled" }, "exim": { "100": "enabled" }, "fail2ban": { "100": "enabled" }, "fcoe": { "100": "enabled" }, "fetchmail": { "100": "enabled" }, "finger": { "100": "enabled" }, "firewalld": { "100": "enabled" }, "firewallgui": { "100": "enabled" }, "firstboot": { "100": "enabled" }, "fprintd": { "100": "enabled" }, "freeipmi": { "100": "enabled" }, "freqset": { "100": "enabled" }, "fstools": { "100": "enabled" }, "ftp": { "100": "enabled" }, "fwupd": { "100": "enabled" }, "games": { "100": "enabled" }, "gdomap": { "100": "enabled" }, "geoclue": { "100": "enabled" }, "getty": { "100": "enabled" }, "git": { "100": "enabled" }, "gitosis": { "100": "enabled" }, "glance": { "100": "enabled" }, "gnome": { "100": "enabled" }, "gpg": { "100": "enabled" }, "gpm": { "100": "enabled" }, "gpsd": { "100": "enabled" }, "gssproxy": { "100": "enabled" }, "guest": { "100": "enabled" }, "hddtemp": { "100": "enabled" }, "hostapd": { "100": "enabled" }, "hostname": { "100": "enabled" }, "hsqldb": { "100": "enabled" }, "hwloc": { "100": "enabled" }, "hypervkvp": { "100": "enabled" }, "ibacm": { "100": "enabled" }, "icecast": { "100": "enabled" }, "inetd": { "100": "enabled" }, "init": { "100": "enabled" }, "inn": { "100": "enabled" }, "insights_client": { "100": "enabled" }, "iodine": { "100": "enabled" }, "iotop": { "100": "enabled" }, "ipmievd": { "100": "enabled" }, "ipsec": { "100": "enabled" }, "iptables": { "100": "enabled" }, "irc": { "100": "enabled" }, "irqbalance": { "100": "enabled" }, "iscsi": { "100": "enabled" }, "isns": { "100": "enabled" }, "jabber": { "100": "enabled" }, "jetty": { "100": "enabled" }, "jockey": { "100": "enabled" }, "journalctl": { "100": "enabled" }, "kdbus": { "100": "enabled" }, "kdump": { "100": "enabled" }, "kdumpgui": { "100": "enabled" }, "keepalived": { "100": "enabled" }, "kerberos": { "100": "enabled" }, "keyboardd": { "100": "enabled" }, "keystone": { "100": "enabled" }, "kismet": { "100": "enabled" }, "kmscon": { "100": 
"enabled" }, "kpatch": { "100": "enabled" }, "ksmtuned": { "100": "enabled" }, "ktalk": { "100": "enabled" }, "l2tp": { "100": "enabled" }, "ldap": { "100": "enabled" }, "libraries": { "100": "enabled" }, "likewise": { "100": "enabled" }, "linuxptp": { "100": "enabled" }, "lircd": { "100": "enabled" }, "livecd": { "100": "enabled" }, "lldpad": { "100": "enabled" }, "loadkeys": { "100": "enabled" }, "locallogin": { "100": "enabled" }, "lockdev": { "100": "enabled" }, "logadm": { "100": "enabled" }, "logging": { "100": "enabled" }, "logrotate": { "100": "enabled" }, "logwatch": { "100": "enabled" }, "lpd": { "100": "enabled" }, "lsm": { "100": "enabled" }, "lttng-tools": { "100": "enabled" }, "lvm": { "100": "enabled" }, "mailman": { "100": "enabled" }, "mailscanner": { "100": "enabled" }, "man2html": { "100": "enabled" }, "mandb": { "100": "enabled" }, "mcelog": { "100": "enabled" }, "mediawiki": { "100": "enabled" }, "memcached": { "100": "enabled" }, "milter": { "100": "enabled" }, "minidlna": { "100": "enabled" }, "minissdpd": { "100": "enabled" }, "mip6d": { "100": "enabled" }, "mirrormanager": { "100": "enabled" }, "miscfiles": { "100": "enabled" }, "mock": { "100": "enabled" }, "modemmanager": { "100": "enabled" }, "modutils": { "100": "enabled" }, "mojomojo": { "100": "enabled" }, "mon_statd": { "100": "enabled" }, "mongodb": { "100": "enabled" }, "motion": { "100": "enabled" }, "mount": { "100": "enabled" }, "mozilla": { "100": "enabled" }, "mpd": { "100": "enabled" }, "mplayer": { "100": "enabled" }, "mrtg": { "100": "enabled" }, "mta": { "100": "enabled" }, "munin": { "100": "enabled" }, "mysql": { "100": "enabled" }, "mythtv": { "100": "enabled" }, "naemon": { "100": "enabled" }, "nagios": { "100": "enabled" }, "namespace": { "100": "enabled" }, "ncftool": { "100": "enabled" }, "netlabel": { "100": "enabled" }, "netutils": { "100": "enabled" }, "networkmanager": { "100": "enabled" }, "ninfod": { "100": "enabled" }, "nis": { "100": "enabled" }, "nova": { 
"100": "enabled" }, "nscd": { "100": "enabled" }, "nsd": { "100": "enabled" }, "nslcd": { "100": "enabled" }, "ntop": { "100": "enabled" }, "ntp": { "100": "enabled" }, "numad": { "100": "enabled" }, "nut": { "100": "enabled" }, "nx": { "100": "enabled" }, "obex": { "100": "enabled" }, "oddjob": { "100": "enabled" }, "opafm": { "100": "enabled" }, "openct": { "100": "enabled" }, "opendnssec": { "100": "enabled" }, "openfortivpn": { "100": "enabled" }, "openhpid": { "100": "enabled" }, "openshift": { "100": "enabled" }, "openshift-origin": { "100": "enabled" }, "opensm": { "100": "enabled" }, "openvpn": { "100": "enabled" }, "openvswitch": { "100": "enabled" }, "openwsman": { "100": "enabled" }, "oracleasm": { "100": "enabled" }, "osad": { "100": "enabled" }, "pads": { "100": "enabled" }, "passenger": { "100": "enabled" }, "pcmcia": { "100": "enabled" }, "pcp": { "100": "enabled" }, "pcscd": { "100": "enabled" }, "pdns": { "100": "enabled" }, "pegasus": { "100": "enabled" }, "permissivedomains": { "100": "enabled" }, "pesign": { "100": "enabled" }, "pingd": { "100": "enabled" }, "piranha": { "100": "enabled" }, "pkcs": { "100": "enabled" }, "pkcs11proxyd": { "100": "enabled" }, "pki": { "100": "enabled" }, "plymouthd": { "100": "enabled" }, "podsleuth": { "100": "enabled" }, "policykit": { "100": "enabled" }, "polipo": { "100": "enabled" }, "portmap": { "100": "enabled" }, "portreserve": { "100": "enabled" }, "postfix": { "100": "enabled" }, "postgresql": { "100": "enabled" }, "postgrey": { "100": "enabled" }, "ppp": { "100": "enabled" }, "prelink": { "100": "enabled" }, "prelude": { "100": "enabled" }, "privoxy": { "100": "enabled" }, "procmail": { "100": "enabled" }, "prosody": { "100": "enabled" }, "psad": { "100": "enabled" }, "ptchown": { "100": "enabled" }, "publicfile": { "100": "enabled" }, "pulseaudio": { "100": "enabled" }, "puppet": { "100": "enabled" }, "pwauth": { "100": "enabled" }, "qmail": { "100": "enabled" }, "qpid": { "100": "enabled" }, 
"quantum": { "100": "enabled" }, "quota": { "100": "enabled" }, "rabbitmq": { "100": "enabled" }, "radius": { "100": "enabled" }, "radvd": { "100": "enabled" }, "raid": { "100": "enabled" }, "rasdaemon": { "100": "enabled" }, "rdisc": { "100": "enabled" }, "readahead": { "100": "enabled" }, "realmd": { "100": "enabled" }, "redis": { "100": "enabled" }, "remotelogin": { "100": "enabled" }, "rhcs": { "100": "enabled" }, "rhev": { "100": "enabled" }, "rhgb": { "100": "enabled" }, "rhnsd": { "100": "enabled" }, "rhsmcertd": { "100": "enabled" }, "ricci": { "100": "enabled" }, "rkhunter": { "100": "enabled" }, "rkt": { "100": "enabled" }, "rlogin": { "100": "enabled" }, "rngd": { "100": "enabled" }, "rolekit": { "100": "enabled" }, "roundup": { "100": "enabled" }, "rpc": { "100": "enabled" }, "rpcbind": { "100": "enabled" }, "rpm": { "100": "enabled" }, "rrdcached": { "100": "enabled" }, "rshd": { "100": "enabled" }, "rssh": { "100": "enabled" }, "rsync": { "100": "enabled" }, "rtas": { "100": "enabled" }, "rtkit": { "100": "enabled" }, "rwho": { "100": "enabled" }, "samba": { "100": "enabled" }, "sambagui": { "100": "enabled" }, "sandboxX": { "100": "enabled" }, "sanlock": { "100": "enabled" }, "sasl": { "100": "enabled" }, "sbd": { "100": "enabled" }, "sblim": { "100": "enabled" }, "screen": { "100": "enabled" }, "secadm": { "100": "enabled" }, "sectoolm": { "100": "enabled" }, "selinuxutil": { "100": "enabled" }, "sendmail": { "100": "enabled" }, "sensord": { "100": "enabled" }, "setrans": { "100": "enabled" }, "setroubleshoot": { "100": "enabled" }, "seunshare": { "100": "enabled" }, "sge": { "100": "enabled" }, "shorewall": { "100": "enabled" }, "slocate": { "100": "enabled" }, "slpd": { "100": "enabled" }, "smartmon": { "100": "enabled" }, "smokeping": { "100": "enabled" }, "smoltclient": { "100": "enabled" }, "smsd": { "100": "enabled" }, "snapper": { "100": "enabled" }, "snmp": { "100": "enabled" }, "snort": { "100": "enabled" }, "sosreport": { "100": "enabled" 
}, "soundserver": { "100": "enabled" }, "spamassassin": { "100": "enabled" }, "speech-dispatcher": { "100": "enabled" }, "squid": { "100": "enabled" }, "ssh": { "100": "enabled" }, "sslh": { "100": "enabled" }, "sssd": { "100": "enabled" }, "staff": { "100": "enabled" }, "stapserver": { "100": "enabled" }, "stratisd": { "100": "enabled" }, "stunnel": { "100": "enabled" }, "su": { "100": "enabled" }, "sudo": { "100": "enabled" }, "svnserve": { "100": "enabled" }, "swift": { "100": "enabled" }, "sysadm": { "100": "enabled" }, "sysadm_secadm": { "100": "enabled" }, "sysnetwork": { "100": "enabled" }, "sysstat": { "100": "enabled" }, "systemd": { "100": "enabled" }, "tangd": { "100": "enabled" }, "targetd": { "100": "enabled" }, "tcpd": { "100": "enabled" }, "tcsd": { "100": "enabled" }, "telepathy": { "100": "enabled" }, "telnet": { "100": "enabled" }, "tftp": { "100": "enabled" }, "tgtd": { "100": "enabled" }, "thin": { "100": "enabled" }, "thumb": { "100": "enabled" }, "timedatex": { "100": "enabled" }, "tlp": { "100": "enabled" }, "tmpreaper": { "100": "enabled" }, "tomcat": { "100": "enabled" }, "tor": { "100": "enabled" }, "tuned": { "100": "enabled" }, "tvtime": { "100": "enabled" }, "udev": { "100": "enabled" }, "ulogd": { "100": "enabled" }, "uml": { "100": "enabled" }, "unconfined": { "100": "enabled" }, "unconfineduser": { "100": "enabled" }, "unlabelednet": { "100": "enabled" }, "unprivuser": { "100": "enabled" }, "updfstab": { "100": "enabled" }, "usbmodules": { "100": "enabled" }, "usbmuxd": { "100": "enabled" }, "userdomain": { "100": "enabled" }, "userhelper": { "100": "enabled" }, "usermanage": { "100": "enabled" }, "usernetctl": { "100": "enabled" }, "uucp": { "100": "enabled" }, "uuidd": { "100": "enabled" }, "varnishd": { "100": "enabled" }, "vdagent": { "100": "enabled" }, "vhostmd": { "100": "enabled" }, "virt": { "100": "enabled" }, "vlock": { "100": "enabled" }, "vmtools": { "100": "enabled" }, "vmware": { "100": "enabled" }, "vnstatd": { "100": 
"enabled" }, "vpn": { "100": "enabled" }, "w3c": { "100": "enabled" }, "watchdog": { "100": "enabled" }, "wdmd": { "100": "enabled" }, "webadm": { "100": "enabled" }, "webalizer": { "100": "enabled" }, "wine": { "100": "enabled" }, "wireshark": { "100": "enabled" }, "xen": { "100": "enabled" }, "xguest": { "100": "enabled" }, "xserver": { "100": "enabled" }, "zabbix": { "100": "enabled" }, "zarafa": { "100": "enabled" }, "zebra": { "100": "enabled" }, "zoneminder": { "100": "enabled" }, "zosremote": { "100": "enabled" } }, "selinux_priorities": true }, "changed": false } TASK [fedora.linux_system_roles.selinux : include_tasks] *********************** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:107 Wednesday 27 July 2022 21:16:23 +0000 (0:00:00.521) 0:00:27.812 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } META: role_complete for /cache/rhel-8-y.qcow2.snap TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] ***** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:31 Wednesday 27 July 2022 21:16:23 +0000 (0:00:00.105) 0:00:27.918 ******** included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/rhel-8-y.qcow2.snap => (item={'state': 'started', 'debug': True, 'log_level': 'debug', 'run_as_user': 'user1', 'kube_file_content': {'apiVersion': 'v1', 'kind': 'Pod', 'metadata': {'labels': {'app': 'test', 'io.containers.autoupdate': 'registry'}, 'name': 'httpd1'}, 'spec': {'containers': [{'name': 'httpd1', 'image': 'quay.io/libpod/testimage:20210610', 'command': ['/bin/busybox-extras', 'httpd', '-f', '-p', 80], 'ports': [{'containerPort': 80, 'hostPort': 8080}], 'volumeMounts': [{'mountPath': '/var/www:Z', 'name': 'www'}, {'mountPath': '/var/httpd-create:Z', 'name': 'create'}], 'workingDir': '/var/www'}], 
'volumes': [{'name': 'www', 'hostPath': {'path': '/tmp/httpd1'}}, {'name': 'create', 'hostPath': {'path': '/tmp/httpd1-create'}}]}}}) included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/rhel-8-y.qcow2.snap => (item={'state': 'started', 'debug': True, 'log_level': 'debug', 'kube_file_content': {'apiVersion': 'v1', 'kind': 'Pod', 'metadata': {'labels': {'app': 'test', 'io.containers.autoupdate': 'registry'}, 'name': 'httpd2'}, 'spec': {'containers': [{'name': 'httpd2', 'image': 'quay.io/libpod/testimage:20210610', 'command': ['/bin/busybox-extras', 'httpd', '-f', '-p', 80], 'ports': [{'containerPort': 80, 'hostPort': 8081}], 'volumeMounts': [{'mountPath': '/var/www:Z', 'name': 'www'}, {'mountPath': '/var/httpd-create:Z', 'name': 'create'}], 'workingDir': '/var/www'}], 'volumes': [{'name': 'www', 'hostPath': {'path': '/tmp/httpd2'}}, {'name': 'create', 'hostPath': {'path': '/tmp/httpd2-create'}}]}}}) included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/rhel-8-y.qcow2.snap => (item={'state': 'started', 'kube_file_src': '/tmp/lsr_podman_k4vfy_7n.yml'}) TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:13 Wednesday 27 July 2022 21:16:23 +0000 (0:00:00.094) 0:00:28.013 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__podman_kube_spec": { "debug": true, "log_level": "debug", "state": "started" }, "__podman_kube_str": "apiVersion: v1\nkind: Pod\nmetadata:\n labels:\n app: test\n io.containers.autoupdate: registry\n name: httpd1\nspec:\n containers:\n - command:\n - /bin/busybox-extras\n - httpd\n - -f\n - -p\n - 80\n image: quay.io/libpod/testimage:20210610\n name: httpd1\n ports:\n - containerPort: 80\n hostPort: 8080\n volumeMounts:\n - mountPath: /var/www:Z\n name: 
www\n - mountPath: /var/httpd-create:Z\n name: create\n workingDir: /var/www\n volumes:\n - hostPath:\n path: /tmp/httpd1\n name: www\n - hostPath:\n path: /tmp/httpd1-create\n name: create\n" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:25 Wednesday 27 July 2022 21:16:23 +0000 (0:00:00.058) 0:00:28.071 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__podman_kube": { "apiVersion": "v1", "kind": "Pod", "metadata": { "labels": { "app": "test", "io.containers.autoupdate": "registry" }, "name": "httpd1" }, "spec": { "containers": [ { "command": [ "/bin/busybox-extras", "httpd", "-f", "-p", 80 ], "image": "quay.io/libpod/testimage:20210610", "name": "httpd1", "ports": [ { "containerPort": 80, "hostPort": 8080 } ], "volumeMounts": [ { "mountPath": "/var/www:Z", "name": "www" }, { "mountPath": "/var/httpd-create:Z", "name": "create" } ], "workingDir": "/var/www" } ], "volumes": [ { "hostPath": { "path": "/tmp/httpd1" }, "name": "www" }, { "hostPath": { "path": "/tmp/httpd1-create" }, "name": "create" } ] } }, "__podman_kube_file": "", "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "user1" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:35 Wednesday 27 July 2022 21:16:23 +0000 (0:00:00.056) 0:00:28.127 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__podman_kube_name": "httpd1", "__podman_rootless": true }, "changed": false } TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:40 Wednesday 27 July 2022 21:16:23 +0000 
(0:00:00.041) 0:00:28.169 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "getent_passwd": { "user1": [ "x", "1001", "1001", "", "/home/user1", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:46 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.447) 0:00:28.617 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if no kube spec is given] ******** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:53 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.028) 0:00:28.645 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:62 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.029) 0:00:28.674 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_group": "1001", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/user1", "__podman_xdg_runtime_dir": "/run/user/1001" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:78 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.055) 0:00:28.730 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__podman_kube_path": "/home/user1/.config/containers/ansible-kubernetes.d" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 
5] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:82 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.035) 0:00:28.766 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__podman_kube_file": "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get service name using systemd-escape] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:86 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.038) 0:00:28.804 ******** changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "cmd": [ "systemd-escape", "--template", "podman-kube@.service", "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" ], "delta": "0:00:00.007253", "end": "2022-07-27 17:16:24.094928", "rc": 0, "start": "2022-07-27 17:16:24.087675" } STDOUT: podman-kube@-home-user1-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service TASK [fedora.linux_system_roles.podman : Cleanup containers and services] ****** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:93 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.386) 0:00:29.190 ******** skipping: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update containers and services] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:97 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.027) 0:00:29.218 ******** included: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml for /cache/rhel-8-y.qcow2.snap TASK [fedora.linux_system_roles.podman : Check if user is lingering] *********** task path: 
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:1 Wednesday 27 July 2022 21:16:24 +0000 (0:00:00.057) 0:00:29.275 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Enable lingering if needed] *********** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:7 Wednesday 27 July 2022 21:16:25 +0000 (0:00:00.369) 0:00:29.645 ******** changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "cmd": [ "loginctl", "enable-linger", "user1" ], "delta": "0:00:00.026354", "end": "2022-07-27 17:16:24.984022", "rc": 0, "start": "2022-07-27 17:16:24.957668" } TASK [fedora.linux_system_roles.podman : Get the host mount volumes] *********** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:13 Wednesday 27 July 2022 21:16:25 +0000 (0:00:00.431) 0:00:30.076 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "ansible_facts": { "__podman_volumes": [ "/tmp/httpd1", "/tmp/httpd1-create" ] }, "changed": false } TASK [fedora.linux_system_roles.podman : Create host directories] ************** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:24 Wednesday 27 July 2022 21:16:25 +0000 (0:00:00.059) 0:00:30.136 ******** [WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat- unsafe) changed: [/cache/rhel-8-y.qcow2.snap] => (item=/tmp/httpd1) => { "ansible_loop_var": "item", "changed": true, "gid": 1001, "group": "user1", "item": "/tmp/httpd1", "mode": "0644", "owner": "user1", "path": "/tmp/httpd1", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 23, "state": "directory", "uid": 1001 } changed: [/cache/rhel-8-y.qcow2.snap] => 
(item=/tmp/httpd1-create) => { "ansible_loop_var": "item", "changed": true, "gid": 1001, "group": "user1", "item": "/tmp/httpd1-create", "mode": "0644", "owner": "user1", "path": "/tmp/httpd1-create", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Ensure container images are present] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:35 Wednesday 27 July 2022 21:16:26 +0000 (0:00:00.725) 0:00:30.861 ******** changed: [/cache/rhel-8-y.qcow2.snap] => (item=quay.io/libpod/testimage:20210610) => { "actions": [ "Pulled image quay.io/libpod/testimage:20210610" ], "ansible_loop_var": "item", "changed": true, "image": [ { "Annotations": {}, "Architecture": "amd64", "Author": "", "Comment": "", "Config": { "Cmd": [ "/bin/echo", "This container is intended for podman CI testing" ], "Env": [ "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin" ], "Labels": { "created_at": "2021-06-10T18:55:36Z", "created_by": "test/system/build-testimage", "io.buildah.version": "1.21.0" }, "WorkingDir": "/home/podman" }, "Created": "2021-06-10T18:55:43.049643585Z", "Digest": "sha256:d48f2feaca74863c342cd9ce11edbe208675975740e7f4dd635b7b345339426a", "GraphDriver": { "Data": { "UpperDir": "/home/user1/.local/share/containers/storage/overlay/f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e/diff", "WorkDir": "/home/user1/.local/share/containers/storage/overlay/f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e/work" }, "Name": "overlay" }, "History": [ { "created": "2021-06-10T18:55:42.831917915Z", "created_by": "/bin/sh -c apk add busybox-extras", "empty_layer": true }, { "created": "2021-06-10T18:55:43.005956291Z", "created_by": "/bin/sh -c #(nop) ADD multi:0ed825786ec12498034356148303d2e6dfd4698131f4b5d4599e5eafa2ab71bd in /home/podman/ ", "empty_layer": true }, { "created": 
"2021-06-10T18:55:43.006000972Z", "created_by": "/bin/sh -c #(nop) LABEL created_by=test/system/build-testimage", "empty_layer": true }, { "created": "2021-06-10T18:55:43.006019818Z", "created_by": "/bin/sh -c #(nop) LABEL created_at=2021-06-10T18:55:36Z", "empty_layer": true }, { "created": "2021-06-10T18:55:43.028748885Z", "created_by": "/bin/sh -c #(nop) WORKDIR /home/podman", "empty_layer": true }, { "comment": "FROM docker.io/amd64/alpine:3.13.5", "created": "2021-06-10T18:55:43.160651456Z", "created_by": "/bin/sh -c #(nop) CMD [\"/bin/echo\", \"This container is intended for podman CI testing\"]" } ], "Id": "9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f", "Labels": { "created_at": "2021-06-10T18:55:36Z", "created_by": "test/system/build-testimage", "io.buildah.version": "1.21.0" }, "ManifestType": "application/vnd.docker.distribution.manifest.v2+json", "NamesHistory": [ "quay.io/libpod/testimage:20210610" ], "Os": "linux", "Parent": "", "RepoDigests": [ "quay.io/libpod/testimage@sha256:d48f2feaca74863c342cd9ce11edbe208675975740e7f4dd635b7b345339426a", "quay.io/libpod/testimage@sha256:d8dc9f2a78e190963a75852ce55b926a1cf90c7d2e6d15b30b6bc43cd73a6377" ], "RepoTags": [ "quay.io/libpod/testimage:20210610" ], "RootFS": { "Layers": [ "sha256:f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e" ], "Type": "layers" }, "Size": 7987860, "User": "", "Version": "", "VirtualSize": 7987860 } ], "item": "quay.io/libpod/testimage:20210610", "podman_actions": [ "/bin/podman image ls quay.io/libpod/testimage:20210610 --format json", "/bin/podman pull quay.io/libpod/testimage:20210610 -q", "/bin/podman inspect 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f --format json" ], "warnings": [ "Module remote_tmp /home/user1/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. 
To avoid this, create the remote_tmp dir with the correct permissions manually" ] } [WARNING]: Module remote_tmp /home/user1/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually TASK [fedora.linux_system_roles.podman : Check the kubernetes yaml file] ******* task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:51 Wednesday 27 July 2022 21:16:28 +0000 (0:00:02.136) 0:00:32.998 ******** ok: [/cache/rhel-8-y.qcow2.snap] => { "changed": false, "failed_when_result": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Ensure the kubernetes directory is present] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:59 Wednesday 27 July 2022 21:16:29 +0000 (0:00:00.399) 0:00:33.398 ******** changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "gid": 1001, "group": "user1", "mode": "0700", "owner": "user1", "path": "/home/user1/.config/containers/ansible-kubernetes.d", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 6, "state": "directory", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Ensure kubernetes yaml files are present] *** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:68 Wednesday 27 July 2022 21:16:29 +0000 (0:00:00.390) 0:00:33.788 ******** changed: [/cache/rhel-8-y.qcow2.snap] => { "changed": true, "checksum": "f36f5b3fd8752a059ae217d04f65ba46b054d773", "dest": "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml", "gid": 1001, "group": "user1", "md5sum": "47ccac83d9de77d9645ae1ef4733269a", "mode": "0600", "owner": "user1", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 721, "src": 
"/root/.ansible/tmp/ansible-tmp-1658956589.5550935-34901-131971261030937/source", "state": "file", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Update containers/pods] *************** task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:78 Wednesday 27 July 2022 21:16:30 +0000 (0:00:00.675) 0:00:34.463 ******** [WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat- unsafe) changed: [/cache/rhel-8-y.qcow2.snap] => { "actions": [ "/bin/podman play kube --start=true --log-level=debug /home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" ], "changed": true } STDOUT: Pod: c9a95aae6466f3877cb64fe6b8369ccee76173444f826c4c2cace40347562d09 Container: 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 STDERR: time="2022-07-27T17:16:29-04:00" level=info msg="/bin/podman filtering at log level debug" time="2022-07-27T17:16:29-04:00" level=debug msg="Called kube.PersistentPreRunE(/bin/podman play kube --start=true --log-level=debug /home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml)" time="2022-07-27T17:16:29-04:00" level=debug msg="Merged system config \"/usr/share/containers/containers.conf\"" time="2022-07-27T17:16:29-04:00" level=debug msg="Merged system config \"/etc/containers/containers.conf.d/50-systemroles.conf\"" time="2022-07-27T17:16:29-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" time="2022-07-27T17:16:29-04:00" level=debug msg="Initializing boltdb state at /home/user1/.local/share/containers/storage/libpod/bolt_state.db" time="2022-07-27T17:16:29-04:00" level=debug msg="Using graph driver overlay" time="2022-07-27T17:16:29-04:00" level=debug msg="Using graph root /home/user1/.local/share/containers/storage" time="2022-07-27T17:16:29-04:00" level=debug msg="Using run root /run/user/1001/containers" time="2022-07-27T17:16:29-04:00" 
level=debug msg="Using static dir /home/user1/.local/share/containers/storage/libpod" time="2022-07-27T17:16:29-04:00" level=debug msg="Using tmp dir /run/user/1001/libpod/tmp" time="2022-07-27T17:16:29-04:00" level=debug msg="Using volume path /home/user1/.local/share/containers/storage/volumes" time="2022-07-27T17:16:29-04:00" level=debug msg="Set libpod namespace to \"\"" time="2022-07-27T17:16:29-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that metacopy is not being used" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that native-diff is usable" time="2022-07-27T17:16:29-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false" time="2022-07-27T17:16:29-04:00" level=debug msg="Initializing event backend file" time="2022-07-27T17:16:29-04:00" level=debug msg="Configured OCI runtime crun initialization failed: no valid executable found for OCI runtime crun: invalid argument" time="2022-07-27T17:16:29-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument" time="2022-07-27T17:16:29-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument" time="2022-07-27T17:16:29-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument" time="2022-07-27T17:16:29-04:00" level=debug msg="Using OCI runtime \"/usr/bin/runc\"" time="2022-07-27T17:16:29-04:00" level=info msg="Setting parallel job count to 13" time="2022-07-27T17:16:29-04:00" level=debug 
msg="Looking up image \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage" time="2022-07-27T17:16:29-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:16:29-04:00" level=debug msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." time="2022-07-27T17:16:29-04:00" level=debug msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." time="2022-07-27T17:16:29-04:00" level=debug msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." time="2022-07-27T17:16:29-04:00" level=debug msg="FROM \"scratch\"" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that overlay is not supported" time="2022-07-27T17:16:29-04:00" level=debug msg="Check for idmapped mounts support " time="2022-07-27T17:16:29-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:16:29-04:00" level=debug msg="Cached value indicated that metacopy is not being used" time="2022-07-27T17:16:29-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false" time="2022-07-27T17:16:29-04:00" level=debug msg="overlay: test mount indicated that volatile is being used" time="2022-07-27T17:16:29-04:00" level=debug msg="overlay: mount_data=lowerdir=/home/user1/.local/share/containers/storage/overlay/314f4906169b6da9eed0f7029611f2c34d292bc7f232ce5a3eeae4858f1904ca/empty,upperdir=/home/user1/.local/share/containers/storage/overlay/314f4906169b6da9eed0f7029611f2c34d292bc7f232ce5a3eeae4858f1904ca/diff,workdir=/home/user1/.local/share/containers/storage/overlay/314f4906169b6da9eed0f7029611f2c34d292bc7f232ce5a3eeae4858f1904ca/work,,userxattr,volatile,context=\"system_u:object_r:container_file_t:s0:c141,c386\"" 
time="2022-07-27T17:16:29-04:00" level=debug msg="Container ID: cc650eeedaf622118da37e1320713242c9d8121126d8d3997e9c1bea5310d7d6" time="2022-07-27T17:16:29-04:00" level=debug msg="Parsed Step: {Env:[PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin] Command:copy Args:[/usr/libexec/podman/catatonit /catatonit] Flags:[] Attrs:map[] Message:COPY /usr/libexec/podman/catatonit /catatonit Original:COPY /usr/libexec/podman/catatonit /catatonit}" time="2022-07-27T17:16:29-04:00" level=debug msg="COPY []string(nil), imagebuilder.Copy{FromFS:false, From:\"\", Src:[]string{\"/usr/libexec/podman/catatonit\"}, Dest:\"/catatonit\", Download:false, Chown:\"\", Chmod:\"\"}" time="2022-07-27T17:16:30-04:00" level=debug msg="added content file:8baa0179bee9f068b658fb94bbc7e04df57981b80ed68a6c628f73699ce1f8e4" time="2022-07-27T17:16:30-04:00" level=debug msg="Parsed Step: {Env:[PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin] Command:entrypoint Args:[/catatonit -P] Flags:[] Attrs:map[json:true] Message:ENTRYPOINT /catatonit -P Original:ENTRYPOINT [\"/catatonit\", \"-P\"]}" time="2022-07-27T17:16:30-04:00" level=debug msg="COMMIT localhost/podman-pause:4.1.1-1657274017" time="2022-07-27T17:16:30-04:00" level=debug msg="Looking up image \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage" time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." 
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.1.1-1657274017\"" time="2022-07-27T17:16:30-04:00" level=debug msg="COMMIT \"containers-storage:[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.1.1-1657274017\"" time="2022-07-27T17:16:30-04:00" level=debug msg="committing image with reference \"containers-storage:[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.1.1-1657274017\" is allowed by policy" time="2022-07-27T17:16:30-04:00" level=debug msg="layer list: [\"314f4906169b6da9eed0f7029611f2c34d292bc7f232ce5a3eeae4858f1904ca\"]" time="2022-07-27T17:16:30-04:00" level=debug msg="using \"/var/tmp/buildah831872776\" to hold temporary data" time="2022-07-27T17:16:30-04:00" level=debug msg="Tar with options on /home/user1/.local/share/containers/storage/overlay/314f4906169b6da9eed0f7029611f2c34d292bc7f232ce5a3eeae4858f1904ca/diff" time="2022-07-27T17:16:30-04:00" level=debug msg="layer \"314f4906169b6da9eed0f7029611f2c34d292bc7f232ce5a3eeae4858f1904ca\" size is 737792 bytes, uncompressed digest sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2, possibly-compressed digest sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2" time="2022-07-27T17:16:30-04:00" level=debug msg="OCIv1 config = {\"created\":\"2022-07-27T21:16:30.110073501Z\",\"architecture\":\"amd64\",\"os\":\"linux\",\"config\":{\"Env\":[\"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\"],\"Entrypoint\":[\"/catatonit\",\"-P\"],\"Labels\":{\"io.buildah.version\":\"1.26.2\"}},\"rootfs\":{\"type\":\"layers\",\"diff_ids\":[\"sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2\"]},\"history\":[{\"created\":\"2022-07-27T21:16:30.109330372Z\",\"created_by\":\"/bin/sh -c #(nop) COPY 
file:8baa0179bee9f068b658fb94bbc7e04df57981b80ed68a6c628f73699ce1f8e4 in /catatonit \",\"empty_layer\":true},{\"created\":\"2022-07-27T21:16:30.113153483Z\",\"created_by\":\"/bin/sh -c #(nop) ENTRYPOINT [\\\"/catatonit\\\", \\\"-P\\\"]\"}]}" time="2022-07-27T17:16:30-04:00" level=debug msg="OCIv1 manifest = {\"schemaVersion\":2,\"mediaType\":\"application/vnd.oci.image.manifest.v1+json\",\"config\":{\"mediaType\":\"application/vnd.oci.image.config.v1+json\",\"digest\":\"sha256:797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\",\"size\":668},\"layers\":[{\"mediaType\":\"application/vnd.oci.image.layer.v1.tar\",\"digest\":\"sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2\",\"size\":737792}],\"annotations\":{\"org.opencontainers.image.base.digest\":\"\",\"org.opencontainers.image.base.name\":\"\"}}" time="2022-07-27T17:16:30-04:00" level=debug msg="Docker v2s2 config = {\"created\":\"2022-07-27T21:16:30.110073501Z\",\"container\":\"cc650eeedaf622118da37e1320713242c9d8121126d8d3997e9c1bea5310d7d6\",\"container_config\":{\"Hostname\":\"\",\"Domainname\":\"\",\"User\":\"\",\"AttachStdin\":false,\"AttachStdout\":false,\"AttachStderr\":false,\"Tty\":false,\"OpenStdin\":false,\"StdinOnce\":false,\"Env\":[\"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\"],\"Cmd\":[],\"Image\":\"\",\"Volumes\":{},\"WorkingDir\":\"\",\"Entrypoint\":[\"/catatonit\",\"-P\"],\"OnBuild\":[],\"Labels\":{\"io.buildah.version\":\"1.26.2\"}},\"config\":{\"Hostname\":\"\",\"Domainname\":\"\",\"User\":\"\",\"AttachStdin\":false,\"AttachStdout\":false,\"AttachStderr\":false,\"Tty\":false,\"OpenStdin\":false,\"StdinOnce\":false,\"Env\":[\"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\"],\"Cmd\":[],\"Image\":\"\",\"Volumes\":{},\"WorkingDir\":\"\",\"Entrypoint\":[\"/catatonit\",\"-P\"],\"OnBuild\":[],\"Labels\":{\"io.buildah.version\":\"1.26.2\"}},\"architecture\":\"amd64\",\"os\":\"linux\",\"rootfs\":{\"type\":\"layers\",\"diff
_ids\":[\"sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2\"]},\"history\":[{\"created\":\"2022-07-27T21:16:30.109330372Z\",\"created_by\":\"/bin/sh -c #(nop) COPY file:8baa0179bee9f068b658fb94bbc7e04df57981b80ed68a6c628f73699ce1f8e4 in /catatonit \",\"empty_layer\":true},{\"created\":\"2022-07-27T21:16:30.113153483Z\",\"created_by\":\"/bin/sh -c #(nop) ENTRYPOINT [\\\"/catatonit\\\", \\\"-P\\\"]\"}]}" time="2022-07-27T17:16:30-04:00" level=debug msg="Docker v2s2 manifest = {\"schemaVersion\":2,\"mediaType\":\"application/vnd.docker.distribution.manifest.v2+json\",\"config\":{\"mediaType\":\"application/vnd.docker.container.image.v1+json\",\"size\":1342,\"digest\":\"sha256:820ed727e2c97581f8b98909ec036e762eff8507ffd18506f49dc135db8d0619\"},\"layers\":[{\"mediaType\":\"application/vnd.docker.image.rootfs.diff.tar\",\"size\":737792,\"digest\":\"sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2\"}]}" time="2022-07-27T17:16:30-04:00" level=debug msg="Using blob info cache at /home/user1/.local/share/containers/cache/blob-info-cache-v1.boltdb" time="2022-07-27T17:16:30-04:00" level=debug msg="IsRunningImageAllowed for image containers-storage:" time="2022-07-27T17:16:30-04:00" level=debug msg=" Using transport \"containers-storage\" policy section " time="2022-07-27T17:16:30-04:00" level=debug msg=" Requirement 0: allowed" time="2022-07-27T17:16:30-04:00" level=debug msg="Overall: allowed" time="2022-07-27T17:16:30-04:00" level=debug msg="start reading config" time="2022-07-27T17:16:30-04:00" level=debug msg="finished reading config" time="2022-07-27T17:16:30-04:00" level=debug msg="Manifest has MIME type application/vnd.oci.image.manifest.v1+json, ordered candidate list [application/vnd.oci.image.manifest.v1+json, application/vnd.docker.distribution.manifest.v2+json, application/vnd.docker.distribution.manifest.v1+prettyjws, application/vnd.docker.distribution.manifest.v1+json]" time="2022-07-27T17:16:30-04:00" level=debug 
msg="... will first try using the original manifest unmodified" time="2022-07-27T17:16:30-04:00" level=debug msg="reading layer \"sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2\"" time="2022-07-27T17:16:30-04:00" level=debug msg="No compression detected" time="2022-07-27T17:16:30-04:00" level=debug msg="Using original blob without modification" time="2022-07-27T17:16:30-04:00" level=debug msg="Cached value indicated that overlay is not supported" time="2022-07-27T17:16:30-04:00" level=debug msg="Check for idmapped mounts support " time="2022-07-27T17:16:30-04:00" level=debug msg="Applying tar in /home/user1/.local/share/containers/storage/overlay/a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2/diff" time="2022-07-27T17:16:30-04:00" level=debug msg="finished reading layer \"sha256:a427e42219ff9c3f4bcd3826bd1066c122bfd16d19a6df7b6330d635268116e2\"" time="2022-07-27T17:16:30-04:00" level=debug msg="No compression detected" time="2022-07-27T17:16:30-04:00" level=debug msg="Using original blob without modification" time="2022-07-27T17:16:30-04:00" level=debug msg="setting image creation date to 2022-07-27 21:16:30.110073501 +0000 UTC" time="2022-07-27T17:16:30-04:00" level=debug msg="created new image ID \"797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\"" time="2022-07-27T17:16:30-04:00" level=debug msg="saved image metadata \"{\\\"signatures-sizes\\\":{\\\"sha256:32d74dc13867dd1de45b90ddb01221da5818874ca561712e1a097edd8ddaa144\\\":[]}}\"" time="2022-07-27T17:16:30-04:00" level=debug msg="added name \"localhost/podman-pause:4.1.1-1657274017\" to image \"797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\"" time="2022-07-27T17:16:30-04:00" level=debug msg="Looking up image \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage" time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:16:30-04:00" level=debug 
msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\"" time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"localhost/podman-pause:4.1.1-1657274017\" as \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage" time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"localhost/podman-pause:4.1.1-1657274017\" as \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6)" time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.1.1-1657274017\"" time="2022-07-27T17:16:30-04:00" level=debug msg="printing final image id \"797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\"" time="2022-07-27T17:16:30-04:00" level=debug msg="Got pod cgroup as /libpod_parent/c9a95aae6466f3877cb64fe6b8369ccee76173444f826c4c2cace40347562d09" time="2022-07-27T17:16:30-04:00" level=debug msg="Looking up image \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage" time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"localhost/podman-pause:4.1.1-1657274017\" ..." 
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"localhost/podman-pause:4.1.1-1657274017\" as \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"localhost/podman-pause:4.1.1-1657274017\" as \"localhost/podman-pause:4.1.1-1657274017\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6)"
time="2022-07-27T17:16:30-04:00" level=debug msg="Inspecting image 797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6"
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\""
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Inspecting image 797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6"
time="2022-07-27T17:16:30-04:00" level=debug msg="Inspecting image 797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6"
time="2022-07-27T17:16:30-04:00" level=debug msg="Inspecting image 797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6"
time="2022-07-27T17:16:30-04:00" level=debug msg="using systemd mode: false"
time="2022-07-27T17:16:30-04:00" level=debug msg="setting container name c9a95aae6466-infra"
time="2022-07-27T17:16:30-04:00" level=debug msg="Loading seccomp profile from \"/usr/share/containers/seccomp.json\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Allocated lock 1 for container bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44"
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\""
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:797e1e4b0c60c7d78321962506f26959469b784cb1804b280cb961b342df45d6\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Created container \"bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Container \"bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44\" has work directory \"/home/user1/.local/share/containers/storage/overlay-containers/bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44/userdata\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Container \"bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44\" has run directory \"/run/user/1001/containers/overlay-containers/bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44/userdata\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:16:30-04:00" level=debug msg="Pulling image quay.io/libpod/testimage:20210610 (policy: newer)"
time="2022-07-27T17:16:30-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:16:30-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/000-shortnames.conf\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/001-rhel-shortnames.conf\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/002-rhel-shortnames-overrides.conf\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/50-systemroles.conf\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:16:30-04:00" level=debug msg="Attempting to pull candidate quay.io/libpod/testimage:20210610 for quay.io/libpod/testimage:20210610"
time="2022-07-27T17:16:30-04:00" level=debug msg="Trying to access \"quay.io/libpod/testimage:20210610\""
time="2022-07-27T17:16:30-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /run/user/1001/containers/auth.json"
time="2022-07-27T17:16:30-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /home/user1/.config/containers/auth.json"
time="2022-07-27T17:16:30-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /home/user1/.docker/config.json"
time="2022-07-27T17:16:30-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /home/user1/.dockercfg"
time="2022-07-27T17:16:30-04:00" level=debug msg="No credentials for quay.io/libpod/testimage found"
time="2022-07-27T17:16:30-04:00" level=debug msg="Using registries.d directory /etc/containers/registries.d for sigstore configuration"
time="2022-07-27T17:16:30-04:00" level=debug msg=" Using \"default-docker\" configuration"
time="2022-07-27T17:16:30-04:00" level=debug msg=" No signature storage configuration found for quay.io/libpod/testimage:20210610, using built-in default file:///home/user1/.local/share/containers/sigstore"
time="2022-07-27T17:16:30-04:00" level=debug msg="Looking for TLS certificates and private keys in /etc/docker/certs.d/quay.io"
time="2022-07-27T17:16:30-04:00" level=debug msg="GET https://quay.io/v2/"
time="2022-07-27T17:16:30-04:00" level=debug msg="Ping https://quay.io/v2/ status 401"
time="2022-07-27T17:16:30-04:00" level=debug msg="GET https://quay.io/v2/auth?scope=repository%3Alibpod%2Ftestimage%3Apull&service=quay.io"
time="2022-07-27T17:16:30-04:00" level=debug msg="Increasing token expiration to: 60 seconds"
time="2022-07-27T17:16:30-04:00" level=debug msg="GET https://quay.io/v2/libpod/testimage/manifests/20210610"
time="2022-07-27T17:16:30-04:00" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.list.v2+json\""
time="2022-07-27T17:16:30-04:00" level=debug msg="GET https://quay.io/v2/libpod/testimage/manifests/sha256:d8dc9f2a78e190963a75852ce55b926a1cf90c7d2e6d15b30b6bc43cd73a6377"
time="2022-07-27T17:16:30-04:00" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.v2+json\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Skipping pull candidate quay.io/libpod/testimage:20210610 as the image is not newer (pull policy newer)"
time="2022-07-27T17:16:30-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:16:30-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:16:30-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:16:30-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:16:30-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T17:16:30-04:00" level=debug msg="using systemd mode: false"
time="2022-07-27T17:16:30-04:00" level=debug msg="adding container to pod httpd1"
time="2022-07-27T17:16:30-04:00" level=debug msg="setting container name httpd1-httpd1"
time="2022-07-27T17:16:30-04:00" level=debug msg="Loading default seccomp profile"
time="2022-07-27T17:16:30-04:00" level=info msg="Sysctl net.ipv4.ping_group_range=0 0 ignored in containers.conf, since Network Namespace set to host"
time="2022-07-27T17:16:30-04:00" level=debug msg="Adding mount /proc"
time="2022-07-27T17:16:30-04:00" level=debug msg="Adding mount /dev"
time="2022-07-27T17:16:30-04:00" level=debug msg="Adding mount /dev/pts"
time="2022-07-27T17:16:30-04:00" level=debug msg="Adding mount /dev/mqueue"
time="2022-07-27T17:16:30-04:00" level=debug msg="Adding mount /sys"
time="2022-07-27T17:16:30-04:00" level=debug msg="Adding mount /sys/fs/cgroup"
time="2022-07-27T17:16:30-04:00" level=debug msg="Allocated lock 2 for container 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5"
time="2022-07-27T17:16:30-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Created container \"03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Container \"03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5\" has work directory \"/home/user1/.local/share/containers/storage/overlay-containers/03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5/userdata\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Container \"03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5\" has run directory \"/run/user/1001/containers/overlay-containers/03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5/userdata\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Strongconnecting node 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5"
time="2022-07-27T17:16:30-04:00" level=debug msg="Pushed 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 onto stack"
time="2022-07-27T17:16:30-04:00" level=debug msg="Recursing to successor node bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44"
time="2022-07-27T17:16:30-04:00" level=debug msg="Strongconnecting node bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44"
time="2022-07-27T17:16:30-04:00" level=debug msg="Pushed bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 onto stack"
time="2022-07-27T17:16:30-04:00" level=debug msg="Finishing node bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44. Popped bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 off stack"
time="2022-07-27T17:16:30-04:00" level=debug msg="Finishing node 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5. Popped 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 off stack"
time="2022-07-27T17:16:30-04:00" level=debug msg="overlay: mount_data=lowerdir=/home/user1/.local/share/containers/storage/overlay/l/QPUIBD6NJ3J7K63N4SQAIZYHVV,upperdir=/home/user1/.local/share/containers/storage/overlay/013dc6e11464b2d92cc40e3ea7d890835153ebb5e736c6f9288f73314f173e1e/diff,workdir=/home/user1/.local/share/containers/storage/overlay/013dc6e11464b2d92cc40e3ea7d890835153ebb5e736c6f9288f73314f173e1e/work,,userxattr,context=\"system_u:object_r:container_file_t:s0:c619,c685\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Mounted container \"bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44\" at \"/home/user1/.local/share/containers/storage/overlay/013dc6e11464b2d92cc40e3ea7d890835153ebb5e736c6f9288f73314f173e1e/merged\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Made network namespace at /run/user/1001/netns/netns-48137a90-dfa2-2ad1-43cd-b775e5fa90e7 for container bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44"
time="2022-07-27T17:16:30-04:00" level=debug msg="Created root filesystem for container bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 at /home/user1/.local/share/containers/storage/overlay/013dc6e11464b2d92cc40e3ea7d890835153ebb5e736c6f9288f73314f173e1e/merged"
time="2022-07-27T17:16:30-04:00" level=debug msg="slirp4netns command: /bin/slirp4netns --disable-host-loopback --mtu=65520 --enable-sandbox --enable-seccomp --enable-ipv6 -c -e 3 -r 4 --netns-type=path /run/user/1001/netns/netns-48137a90-dfa2-2ad1-43cd-b775e5fa90e7 tap0"
time="2022-07-27T17:16:30-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:16:30-04:00\" level=info msg=\"Starting parent driver\"\ntime=\"2022-07-27T17:16:30-04:00\" level=info msg=\"opaque=map[builtin.readypipepath:/run/user/1001/libpod/tmp/rootlessport2956277055/.bp-ready.pipe builtin.socketpath:/run/user/1001/libpod/tmp/rootlessport2956277055/.bp.sock]\"\n"
time="2022-07-27T17:16:30-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:16:30-04:00\" level=info msg=\"Starting child driver in child netns (\\\"/proc/self/exe\\\" [rootlessport-child])\"\n"
time="2022-07-27T17:16:30-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:16:30-04:00\" level=info msg=\"Waiting for initComplete\"\n"
time="2022-07-27T17:16:30-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:16:30-04:00\" level=info msg=\"initComplete is closed; parent and child established the communication channel\"\n"
time="2022-07-27T17:16:30-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:16:30-04:00\" level=info msg=\"Exposing ports [{ 80 8080 1 tcp}]\"\n"
time="2022-07-27T17:16:30-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:16:30-04:00\" level=info msg=Ready\n"
time="2022-07-27T17:16:30-04:00" level=debug msg="rootlessport is ready"
time="2022-07-27T17:16:30-04:00" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription"
time="2022-07-27T17:16:30-04:00" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d"
time="2022-07-27T17:16:30-04:00" level=debug msg="Workdir \"/\" resolved to host path \"/home/user1/.local/share/containers/storage/overlay/013dc6e11464b2d92cc40e3ea7d890835153ebb5e736c6f9288f73314f173e1e/merged\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Created OCI spec for container bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 at /home/user1/.local/share/containers/storage/overlay-containers/bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44/userdata/config.json"
time="2022-07-27T17:16:30-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog"
time="2022-07-27T17:16:30-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 -u bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 -r /usr/bin/runc -b /home/user1/.local/share/containers/storage/overlay-containers/bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44/userdata -p /run/user/1001/containers/overlay-containers/bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44/userdata/pidfile -n c9a95aae6466-infra --exit-dir /run/user/1001/libpod/tmp/exits --full-attach -l k8s-file:/home/user1/.local/share/containers/storage/overlay-containers/bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44/userdata/ctr.log --log-level debug --syslog --conmon-pidfile /run/user/1001/containers/overlay-containers/bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /home/user1/.local/share/containers/storage --exit-command-arg --runroot --exit-command-arg /run/user/1001/containers --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg cgroupfs --exit-command-arg --tmpdir --exit-command-arg /run/user/1001/libpod/tmp --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg cni --exit-command-arg --volumepath --exit-command-arg /home/user1/.local/share/containers/storage/volumes --exit-command-arg --runtime --exit-command-arg runc --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --events-backend --exit-command-arg file --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44]"
time="2022-07-27T17:16:30-04:00" level=info msg="Failed to add conmon to cgroupfs sandbox cgroup: error creating cgroup for cpuset: mkdir /sys/fs/cgroup/cpuset/libpod_parent: permission denied"
[conmon:d]: failed to write to /proc/self/oom_score_adj: Permission denied
time="2022-07-27T17:16:30-04:00" level=debug msg="Received: 16429"
time="2022-07-27T17:16:30-04:00" level=info msg="Got Conmon PID as 16417"
time="2022-07-27T17:16:30-04:00" level=debug msg="Created container bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 in OCI runtime"
time="2022-07-27T17:16:30-04:00" level=debug msg="Starting container bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44 with command [/catatonit -P]"
time="2022-07-27T17:16:30-04:00" level=debug msg="Started container bf1b024300e2a91df05d2101e04536ab5741ed88afa4931147685aa2534bfc44"
time="2022-07-27T17:16:30-04:00" level=debug msg="overlay: mount_data=lowerdir=/home/user1/.local/share/containers/storage/overlay/l/2WKZUDVMVH7SMLYVTG7ZMRXFKE,upperdir=/home/user1/.local/share/containers/storage/overlay/bb993d3a578274730f489b2f7fe5ad9169b7b465f63091fad20d098183526367/diff,workdir=/home/user1/.local/share/containers/storage/overlay/bb993d3a578274730f489b2f7fe5ad9169b7b465f63091fad20d098183526367/work,,userxattr,context=\"system_u:object_r:container_file_t:s0:c619,c685\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Mounted container \"03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5\" at \"/home/user1/.local/share/containers/storage/overlay/bb993d3a578274730f489b2f7fe5ad9169b7b465f63091fad20d098183526367/merged\""
time="2022-07-27T17:16:30-04:00" level=debug msg="Created root filesystem for container 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 at /home/user1/.local/share/containers/storage/overlay/bb993d3a578274730f489b2f7fe5ad9169b7b465f63091fad20d098183526367/merged"
time="2022-07-27T17:16:30-04:00" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription"
time="2022-07-27T17:16:30-04:00" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d"
time="2022-07-27T17:16:30-04:00" level=debug msg="Workdir \"/var/www\" resolved to a volume or mount"
time="2022-07-27T17:16:30-04:00" level=debug msg="Created OCI spec for container 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 at /home/user1/.local/share/containers/storage/overlay-containers/03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5/userdata/config.json"
time="2022-07-27T17:16:30-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog"
time="2022-07-27T17:16:30-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 -u 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 -r /usr/bin/runc -b /home/user1/.local/share/containers/storage/overlay-containers/03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5/userdata -p /run/user/1001/containers/overlay-containers/03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5/userdata/pidfile -n httpd1-httpd1 --exit-dir /run/user/1001/libpod/tmp/exits --full-attach -l k8s-file:/home/user1/.local/share/containers/storage/overlay-containers/03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5/userdata/ctr.log --log-level debug --syslog --conmon-pidfile /run/user/1001/containers/overlay-containers/03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /home/user1/.local/share/containers/storage --exit-command-arg --runroot --exit-command-arg /run/user/1001/containers --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg cgroupfs --exit-command-arg --tmpdir --exit-command-arg /run/user/1001/libpod/tmp --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg cni --exit-command-arg --volumepath --exit-command-arg /home/user1/.local/share/containers/storage/volumes --exit-command-arg --runtime --exit-command-arg runc --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --events-backend --exit-command-arg file --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5]"
time="2022-07-27T17:16:30-04:00" level=info msg="Failed to add conmon to cgroupfs sandbox cgroup: error creating cgroup for blkio: mkdir /sys/fs/cgroup/blkio/conmon: permission denied"
[conmon:d]: failed to write to /proc/self/oom_score_adj: Permission denied
time="2022-07-27T17:16:30-04:00" level=debug msg="Received: 16454"
time="2022-07-27T17:16:30-04:00" level=info msg="Got Conmon PID as 16442"
time="2022-07-27T17:16:30-04:00" level=debug msg="Created container 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 in OCI runtime"
time="2022-07-27T17:16:30-04:00" level=debug msg="Starting container 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5 with command [/bin/busybox-extras httpd -f -p 80]"
time="2022-07-27T17:16:30-04:00" level=debug msg="Started container 03ded84d9f99bbcca102d9c94defc1f7a29dc826801ff0db15666170f31203d5"
time="2022-07-27T17:16:30-04:00" level=debug msg="Called kube.PersistentPostRunE(/bin/podman play kube --start=true --log-level=debug /home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml)"

TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:86
Wednesday 27 July 2022 21:16:31 +0000 (0:00:01.600) 0:00:36.064 ********
ok: [/cache/rhel-8-y.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.podman : Enable service] ***********************
task path: /tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:98
Wednesday 27 July 2022 21:16:32 +0000 (0:00:00.592) 0:00:36.656 ********
fatal: [/cache/rhel-8-y.qcow2.snap]: FAILED! => {
    "changed": false
}

MSG:

Could not find the requested service podman-kube@-home-user1-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service: host

TASK [Clean up storage.conf] ***************************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:346
Wednesday 27 July 2022 21:16:32 +0000 (0:00:00.566) 0:00:37.223 ********
changed: [/cache/rhel-8-y.qcow2.snap] => {
    "changed": true,
    "path": "/etc/containers/storage.conf",
    "state": "absent"
}

TASK [Clean up host directories] ***********************************************
task path: /tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:353
Wednesday 27 July 2022 21:16:33 +0000 (0:00:00.373) 0:00:37.597 ********
changed: [/cache/rhel-8-y.qcow2.snap] => (item=httpd1) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "httpd1",
    "path": "/tmp/httpd1",
    "state": "absent"
}
changed: [/cache/rhel-8-y.qcow2.snap] => (item=httpd2) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "httpd2",
    "path": "/tmp/httpd2",
    "state": "absent"
}
changed: [/cache/rhel-8-y.qcow2.snap] => (item=httpd3) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "httpd3",
    "path": "/tmp/httpd3",
    "state": "absent"
}

PLAY RECAP *********************************************************************
/cache/rhel-8-y.qcow2.snap : ok=57 changed=24 unreachable=0 failed=1 skipped=29 rescued=0 ignored=0

Wednesday 27 July 2022 21:16:34 +0000 (0:00:01.011) 0:00:38.609 ********
===============================================================================
fedora.linux_system_roles.firewall : Install firewalld ------------------ 2.60s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:7
fedora.linux_system_roles.selinux : Set an SELinux label on a port ------ 2.29s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:86
fedora.linux_system_roles.podman : Ensure container images are present --- 2.14s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:35
Install podman from updates-testing ------------------------------------- 2.09s
/tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:139 -----------------------------
Create data files ------------------------------------------------------- 1.94s
/tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:171 -----------------------------
fedora.linux_system_roles.podman : Update containers/pods --------------- 1.60s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:78
fedora.linux_system_roles.podman : Ensure required packages are installed --- 1.53s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.selinux : Install SELinux tool semanage ------- 1.28s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:28
fedora.linux_system_roles.firewall : Install python3-firewall ----------- 1.27s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:18
fedora.linux_system_roles.selinux : Install SELinux python3 tools ------- 1.25s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:15
Create host directories for data ---------------------------------------- 1.20s
/tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:163 -----------------------------
Gathering Facts --------------------------------------------------------- 1.20s
/tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:3 -------------------------------
Enable podman copr ------------------------------------------------------ 1.12s
/tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:136 -----------------------------
Clean up host directories ----------------------------------------------- 1.01s
/tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:353 -----------------------------
fedora.linux_system_roles.firewall : Enable and start firewalld service --- 1.00s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:3
fedora.linux_system_roles.selinux : refresh facts ----------------------- 0.80s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:23
Create user ------------------------------------------------------------- 0.78s
/tmp/tmpi7zgzubs/tests/podman/tests_basic.yml:145 -----------------------------
fedora.linux_system_roles.firewall : Configure firewall ----------------- 0.73s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:36
fedora.linux_system_roles.podman : Create host directories -------------- 0.73s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:24
fedora.linux_system_roles.podman : Update system registries config file --- 0.72s
/tmp/tmpwnjj5ib_/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:21