ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmprng1zc_n
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: rhel-8_setup.yml *****************************************************
1 plays in /cache/rhel-8_setup.yml

PLAY [Setup repos] *************************************************************
META: ran handlers

TASK [set up internal repositories] ********************************************
task path: /cache/rhel-8_setup.yml:5
Wednesday 27 July 2022  21:13:06 +0000 (0:00:00.018)       0:00:00.018 ********
changed: [/cache/rhel-8.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
changed: [/cache/rhel-8.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
changed: [/cache/rhel-8.qcow2] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
changed: [/cache/rhel-8.qcow2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-8.qcow2        : ok=1    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Wednesday 27 July 2022  21:13:07 +0000 (0:00:01.341)       0:00:01.359 ********
===============================================================================
set up internal repositories -------------------------------------------- 1.34s
/cache/rhel-8_setup.yml:5 -----------------------------------------------------

PLAYBOOK: tests_basic.yml ******************************************************
1 plays in /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml

PLAY [Ensure that the role runs with default parameters] ***********************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:3
Wednesday 27 July 2022  21:13:07 +0000 (0:00:00.042)       0:00:01.402 ********
ok: [/cache/rhel-8.qcow2]
META: ran handlers

TASK [Enable podman copr] ******************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:136
Wednesday 27 July 2022  21:13:08 +0000 (0:00:01.300)       0:00:02.703 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "cmd": ["dnf", "copr", "enable", "rhcontainerbot/podman-next", "-y"], "delta": "0:00:01.152892", "end": "2022-07-27 17:13:10.084278", "rc": 0, "start": "2022-07-27 17:13:08.931386" }

STDOUT:

Repository successfully enabled.

STDERR:

Enabling a Copr repository. Please note that this repository is not part of the
main distribution, and quality may vary. The Fedora Project does not exercise
any power over the contents of this repository beyond the rules outlined in the
Copr FAQ at , and packages are not held to any quality or security level.
Please do not file bug reports about these packages in Fedora Bugzilla. In case
of problems, contact the owner of this repository.
TASK [Install podman from updates-testing] *************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:139
Wednesday 27 July 2022  21:13:11 +0000 (0:00:02.511)       0:00:05.214 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "cmd": ["dnf", "-y", "install", "podman"], "delta": "0:00:39.569396", "end": "2022-07-27 17:13:50.064047", "rc": 0, "start": "2022-07-27 17:13:10.494651" }

STDOUT:

Copr repo for podman-next owned by rhcontainerb 6.7 MB/s | 1.8 MB     00:00
rhel-HighAvailability                           9.4 MB/s | 573 kB     00:00
rhel-appstream                                   32 MB/s | 7.5 MB     00:00
rhel-baseos                                      50 MB/s | 2.4 MB     00:00
Dependencies resolved.
=================================================================================================================================================
 Package                      Arch    Version                                    Repository                                                 Size
=================================================================================================================================================
Installing:
 podman                       x86_64  2:4.0.2-6.module+el8.6.0+14673+621cb8be    rhel-appstream                                             13 M
Installing dependencies:
 conmon                       x86_64  2:2.1.0-1.module+el8.6.0+14673+621cb8be    rhel-appstream                                             55 k
 container-selinux            noarch  2:2.179.1-1.module+el8.6.0+14673+621cb8be  rhel-appstream                                             58 k
 containernetworking-plugins  x86_64  1:1.0.1-2.module+el8.6.0+14673+621cb8be    rhel-appstream                                             18 M
 containers-common            x86_64  2:1-27.module+el8.6.0+14673+621cb8be       rhel-appstream                                             95 k
 criu                         x86_64  3.15-3.module+el8.6.0+14673+621cb8be       rhel-appstream                                            518 k
 fuse-common                  x86_64  3.3.0-15.el8                               rhel-baseos                                                22 k
 fuse-overlayfs               x86_64  1.8.2-1.module+el8.6.0+14673+621cb8be      rhel-appstream                                             73 k
 fuse3                        x86_64  3.3.0-15.el8                               rhel-baseos                                                54 k
 fuse3-libs                   x86_64  3.3.0-15.el8                               rhel-baseos                                                95 k
 iptables                     x86_64  1.8.4-22.el8                               rhel-baseos                                               585 k
 iptables-libs                x86_64  1.8.4-22.el8                               rhel-baseos                                               108 k
 libnet                       x86_64  1.1.6-15.el8                               copr:copr.fedorainfracloud.org:rhcontainerbot:podman-next  66 k
 libnetfilter_conntrack       x86_64  1.0.6-5.el8                                rhel-baseos                                                65 k
 libnfnetlink                 x86_64  1.0.1-13.el8                               rhel-baseos                                                33 k
 libnftnl                     x86_64  1.1.5-5.el8                                rhel-baseos                                                83 k
 libslirp                     x86_64  4.4.0-1.module+el8.6.0+14673+621cb8be      rhel-appstream                                             70 k
 nftables                     x86_64  1:0.9.3-25.el8                             rhel-baseos                                               324 k
 podman-catatonit             x86_64  2:4.0.2-6.module+el8.6.0+14673+621cb8be    rhel-appstream                                            354 k
 protobuf-c                   x86_64  1.3.0-6.el8                                rhel-appstream                                             37 k
 runc                         x86_64  1:1.0.3-2.module+el8.6.0+14673+621cb8be    rhel-appstream                                            3.0 M
 shadow-utils-subid           x86_64  2:4.6-16.el8                               rhel-baseos                                               112 k
 slirp4netns                  x86_64  1.1.8-2.module+el8.6.0+14673+621cb8be      rhel-appstream                                             51 k
Enabling module streams:
 container-tools                      rhel8

Transaction Summary
=================================================================================================================================================
Install  23 Packages

Total download size: 37 M
Installed size: 128 M
Downloading Packages:
(1/23): conmon-2.1.0-1.module+el8.6.0+14673+621 2.0 MB/s |  55 kB     00:00
(2/23): container-selinux-2.179.1-1.module+el8. 1.3 MB/s |  58 kB     00:00
(3/23): libnet-1.1.6-15.el8.x86_64.rpm          1.1 MB/s |  66 kB     00:00
(4/23): containers-common-1-27.module+el8.6.0+1 1.8 MB/s |  95 kB     00:00
(5/23): fuse-overlayfs-1.8.2-1.module+el8.6.0+1 2.0 MB/s |  73 kB     00:00
(6/23): criu-3.15-3.module+el8.6.0+14673+621cb8 5.8 MB/s | 518 kB     00:00
(7/23): libslirp-4.4.0-1.module+el8.6.0+14673+6 2.1 MB/s |  70 kB     00:00
(8/23): podman-catatonit-4.0.2-6.module+el8.6.0 6.8 MB/s | 354 kB     00:00
(9/23): protobuf-c-1.3.0-6.el8.x86_64.rpm       7.1 MB/s |  37 kB     00:00
(10/23): runc-1.0.3-2.module+el8.6.0+14673+621c  15 MB/s | 3.0 MB     00:00
(11/23): slirp4netns-1.1.8-2.module+el8.6.0+146 1.3 MB/s |  51 kB     00:00
(12/23): fuse-common-3.3.0-15.el8.x86_64.rpm    2.2 MB/s |  22 kB     00:00
(13/23): fuse3-3.3.0-15.el8.x86_64.rpm          3.4 MB/s |  54 kB     00:00
(14/23): podman-4.0.2-6.module+el8.6.0+14673+62  32 MB/s |  13 MB     00:00
(15/23): fuse3-libs-3.3.0-15.el8.x86_64.rpm     1.2 MB/s |  95 kB     00:00
(16/23): iptables-libs-1.8.4-22.el8.x86_64.rpm   13 MB/s | 108 kB     00:00
(17/23): libnetfilter_conntrack-1.0.6-5.el8.x86 9.0 MB/s |  65 kB     00:00
(18/23): libnfnetlink-1.0.1-13.el8.x86_64.rpm   6.2 MB/s |  33 kB     00:00
(19/23): containernetworking-plugins-1.0.1-2.mo  29 MB/s |  18 MB     00:00
(20/23): iptables-1.8.4-22.el8.x86_64.rpm       5.5 MB/s | 585 kB     00:00
(21/23): libnftnl-1.1.5-5.el8.x86_64.rpm        1.0 MB/s |  83 kB     00:00
(22/23): nftables-0.9.3-25.el8.x86_64.rpm        14 MB/s | 324 kB     00:00
(23/23): shadow-utils-subid-4.6-16.el8.x86_64.r 5.4 MB/s | 112 kB     00:00
--------------------------------------------------------------------------------
Total                                            54 MB/s |  37 MB     00:00
Copr repo for podman-next owned by rhcontainerb  16 kB/s | 1.0 kB     00:00
Key imported successfully
Running transaction check
Transaction check succeeded.
Running transaction test
Transaction test succeeded.
Running transaction
  Preparing        :                                                        1/1
  Installing       : libnftnl-1.1.5-5.el8.x86_64                           1/23
  Running scriptlet: libnftnl-1.1.5-5.el8.x86_64                           1/23
  Installing       : libnfnetlink-1.0.1-13.el8.x86_64                      2/23
  Running scriptlet: libnfnetlink-1.0.1-13.el8.x86_64                      2/23
  Installing       : iptables-libs-1.8.4-22.el8.x86_64                     3/23
  Installing       : nftables-1:0.9.3-25.el8.x86_64                        4/23
  Running scriptlet: nftables-1:0.9.3-25.el8.x86_64                        4/23
  Installing       : fuse3-libs-3.3.0-15.el8.x86_64                        5/23
  Running scriptlet: fuse3-libs-3.3.0-15.el8.x86_64                        5/23
  Running scriptlet: container-selinux-2:2.179.1-1.module+el8.6.0+14673    6/23
  Installing       : container-selinux-2:2.179.1-1.module+el8.6.0+14673    6/23
  Running scriptlet: container-selinux-2:2.179.1-1.module+el8.6.0+14673    6/23
  Installing       : libnetfilter_conntrack-1.0.6-5.el8.x86_64             7/23
  Running scriptlet: libnetfilter_conntrack-1.0.6-5.el8.x86_64             7/23
  Running scriptlet: iptables-1.8.4-22.el8.x86_64                          8/23
  Installing       : iptables-1.8.4-22.el8.x86_64                          8/23
  Running scriptlet: iptables-1.8.4-22.el8.x86_64                          8/23
  Installing       : shadow-utils-subid-2:4.6-16.el8.x86_64                9/23
  Installing       : fuse-common-3.3.0-15.el8.x86_64                      10/23
  Installing       : fuse3-3.3.0-15.el8.x86_64                            11/23
  Installing       : fuse-overlayfs-1.8.2-1.module+el8.6.0+14673+621cb8   12/23
  Running scriptlet: fuse-overlayfs-1.8.2-1.module+el8.6.0+14673+621cb8   12/23
  Installing       : protobuf-c-1.3.0-6.el8.x86_64                        13/23
  Installing       : libslirp-4.4.0-1.module+el8.6.0+14673+621cb8be.x86   14/23
  Installing       : slirp4netns-1.1.8-2.module+el8.6.0+14673+621cb8be.   15/23
  Installing       : containernetworking-plugins-1:1.0.1-2.module+el8.6   16/23
  Installing       : conmon-2:2.1.0-1.module+el8.6.0+14673+621cb8be.x86   17/23
  Installing       : libnet-1.1.6-15.el8.x86_64                           18/23
  Running scriptlet: libnet-1.1.6-15.el8.x86_64                           18/23
  Installing       : criu-3.15-3.module+el8.6.0+14673+621cb8be.x86_64     19/23
  Installing       : runc-1:1.0.3-2.module+el8.6.0+14673+621cb8be.x86_6   20/23
  Installing       : containers-common-2:1-27.module+el8.6.0+14673+621c   21/23
  Installing       : podman-catatonit-2:4.0.2-6.module+el8.6.0+14673+62   22/23
  Installing       : podman-2:4.0.2-6.module+el8.6.0+14673+621cb8be.x86   23/23
  Running scriptlet: container-selinux-2:2.179.1-1.module+el8.6.0+14673   23/23
  Running scriptlet: podman-2:4.0.2-6.module+el8.6.0+14673+621cb8be.x86   23/23
  Verifying        : libnet-1.1.6-15.el8.x86_64                            1/23
  Verifying        : conmon-2:2.1.0-1.module+el8.6.0+14673+621cb8be.x86    2/23
  Verifying        : container-selinux-2:2.179.1-1.module+el8.6.0+14673    3/23
  Verifying        : containernetworking-plugins-1:1.0.1-2.module+el8.6    4/23
  Verifying        : containers-common-2:1-27.module+el8.6.0+14673+621c    5/23
  Verifying        : criu-3.15-3.module+el8.6.0+14673+621cb8be.x86_64      6/23
  Verifying        : fuse-overlayfs-1.8.2-1.module+el8.6.0+14673+621cb8    7/23
  Verifying        : libslirp-4.4.0-1.module+el8.6.0+14673+621cb8be.x86    8/23
  Verifying        : podman-2:4.0.2-6.module+el8.6.0+14673+621cb8be.x86    9/23
  Verifying        : podman-catatonit-2:4.0.2-6.module+el8.6.0+14673+62   10/23
  Verifying        : protobuf-c-1.3.0-6.el8.x86_64                        11/23
  Verifying        : runc-1:1.0.3-2.module+el8.6.0+14673+621cb8be.x86_6   12/23
  Verifying        : slirp4netns-1.1.8-2.module+el8.6.0+14673+621cb8be.   13/23
  Verifying        : fuse-common-3.3.0-15.el8.x86_64                      14/23
  Verifying        : fuse3-3.3.0-15.el8.x86_64                            15/23
  Verifying        : fuse3-libs-3.3.0-15.el8.x86_64                       16/23
  Verifying        : iptables-1.8.4-22.el8.x86_64                         17/23
  Verifying        : iptables-libs-1.8.4-22.el8.x86_64                    18/23
  Verifying        : libnetfilter_conntrack-1.0.6-5.el8.x86_64            19/23
  Verifying        : libnfnetlink-1.0.1-13.el8.x86_64                     20/23
  Verifying        : libnftnl-1.1.5-5.el8.x86_64                          21/23
  Verifying        : nftables-1:0.9.3-25.el8.x86_64                       22/23
  Verifying        : shadow-utils-subid-2:4.6-16.el8.x86_64               23/23

Installed:
  conmon-2:2.1.0-1.module+el8.6.0+14673+621cb8be.x86_64
  container-selinux-2:2.179.1-1.module+el8.6.0+14673+621cb8be.noarch
  containernetworking-plugins-1:1.0.1-2.module+el8.6.0+14673+621cb8be.x86_64
  containers-common-2:1-27.module+el8.6.0+14673+621cb8be.x86_64
  criu-3.15-3.module+el8.6.0+14673+621cb8be.x86_64
  fuse-common-3.3.0-15.el8.x86_64
  fuse-overlayfs-1.8.2-1.module+el8.6.0+14673+621cb8be.x86_64
  fuse3-3.3.0-15.el8.x86_64
  fuse3-libs-3.3.0-15.el8.x86_64
  iptables-1.8.4-22.el8.x86_64
  iptables-libs-1.8.4-22.el8.x86_64
  libnet-1.1.6-15.el8.x86_64
  libnetfilter_conntrack-1.0.6-5.el8.x86_64
  libnfnetlink-1.0.1-13.el8.x86_64
  libnftnl-1.1.5-5.el8.x86_64
  libslirp-4.4.0-1.module+el8.6.0+14673+621cb8be.x86_64
  nftables-1:0.9.3-25.el8.x86_64
  podman-2:4.0.2-6.module+el8.6.0+14673+621cb8be.x86_64
  podman-catatonit-2:4.0.2-6.module+el8.6.0+14673+621cb8be.x86_64
  protobuf-c-1.3.0-6.el8.x86_64
  runc-1:1.0.3-2.module+el8.6.0+14673+621cb8be.x86_64
  shadow-utils-subid-2:4.6-16.el8.x86_64
  slirp4netns-1.1.8-2.module+el8.6.0+14673+621cb8be.x86_64

Complete!
STDERR:

Importing GPG key 0xD87DEB39:
 Userid     : "rhcontainerbot_podman-next (None) "
 Fingerprint: 4937 B714 A16A 535B F4B3 B018 8E54 4399 D87D EB39
 From       : https://download.copr.fedorainfracloud.org/results/rhcontainerbot/podman-next/pubkey.gpg

TASK [Podman version] **********************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:142
Wednesday 27 July 2022  21:13:51 +0000 (0:00:39.989)       0:00:45.204 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "cmd": ["podman", "--version"], "delta": "0:00:00.065388", "end": "2022-07-27 17:13:50.562901", "rc": 0, "start": "2022-07-27 17:13:50.497513" }

STDOUT:

podman version 4.0.2

TASK [Create user] *************************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:145
Wednesday 27 July 2022  21:13:51 +0000 (0:00:00.485)       0:00:45.690 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "comment": "", "create_home": true, "group": 1001, "home": "/home/user1", "name": "user1", "shell": "/bin/bash", "state": "present", "system": false, "uid": 1001 }

TASK [Create tempfile for kube_src] ********************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:149
Wednesday 27 July 2022  21:13:52 +0000 (0:00:00.908)       0:00:46.598 ********
changed: [/cache/rhel-8.qcow2 -> localhost] => { "changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_podman_dkhpy8i1.yml", "size": 0, "state": "file", "uid": 0 }

TASK [Write kube_file_src] *****************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:157
Wednesday 27 July 2022  21:13:53 +0000 (0:00:00.331)       0:00:46.929 ********
changed: [/cache/rhel-8.qcow2 -> localhost] => { "changed": true, "checksum": "7c999d33fe2b60b3c65ec0a85b8924cc4e970d83", "dest": "/tmp/lsr_podman_dkhpy8i1.yml", "gid": 0, "group": "root", "md5sum": "4807f59df21780236988d1eb89142aa2", "mode": "0600", "owner": "root", "size": 665, "src": "/root/.ansible/tmp/ansible-tmp-1658956433.0963998-88406-112724133290132/source", "state": "file", "uid": 0 }

TASK [Create host directories for data] ****************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:163
Wednesday 27 July 2022  21:13:53 +0000 (0:00:00.616)       0:00:47.546 ********
changed: [/cache/rhel-8.qcow2] => (item=['httpd1', 'user1', 1001]) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": ["httpd1", "user1", 1001], "mode": "0755", "owner": "user1", "path": "/tmp/httpd1", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 1001 }
changed: [/cache/rhel-8.qcow2] => (item=['httpd2', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": ["httpd2", "root", 0], "mode": "0755", "owner": "root", "path": "/tmp/httpd2", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0 }
changed: [/cache/rhel-8.qcow2] => (item=['httpd3', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": ["httpd3", "root", 0], "mode": "0755", "owner": "root", "path": "/tmp/httpd3", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0 }

TASK [Create data files] *******************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:171
Wednesday 27 July 2022  21:13:54 +0000 (0:00:01.239)       0:00:48.785 ********
changed: [/cache/rhel-8.qcow2] => (item=['httpd1', 'user1', 1001]) => { "ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd1/index.txt", "gid": 0, "group": "root", "item": ["httpd1", "user1", 1001], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "user1", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956435.015736-88471-80356361322103/source", "state": "file", "uid": 1001 }
changed: [/cache/rhel-8.qcow2] => (item=['httpd2', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd2/index.txt", "gid": 0, "group": "root", "item": ["httpd2", "root", 0], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956435.6908922-88471-12498318929089/source", "state": "file", "uid": 0 }
changed: [/cache/rhel-8.qcow2] => (item=['httpd3', 'root', 0]) => { "ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd3/index.txt", "gid": 0, "group": "root", "item": ["httpd3", "root", 0], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956436.351825-88471-97915203110306/source", "state": "file", "uid": 0 }

TASK [Run role] ****************************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:179
Wednesday 27 July 2022  21:13:56 +0000 (0:00:02.055)       0:00:50.841 ********

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Wednesday 27 July 2022  21:13:57 +0000 (0:00:00.039)       0:00:50.881 ********
included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for /cache/rhel-8.qcow2

TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Wednesday 27 July 2022  21:13:57 +0000 (0:00:00.031)       0:00:50.912 ********
ok: [/cache/rhel-8.qcow2]

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:8
Wednesday 27 July 2022  21:13:57 +0000 (0:00:00.524)       0:00:51.436 ********
skipping: [/cache/rhel-8.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8.qcow2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8.qcow2] => (item=RedHat_8.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_8.yml", "skip_reason": "Conditional result was False" }
skipping: [/cache/rhel-8.qcow2] => (item=RedHat_8.6.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat_8.6.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Wednesday 27 July 2022  21:13:57 +0000 (0:00:00.040)       0:00:51.477 ********
ok: [/cache/rhel-8.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:11
Wednesday 27 July 2022  21:13:59 +0000 (0:00:01.544)       0:00:53.022 ********
included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for /cache/rhel-8.qcow2

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists - system] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:3
Wednesday 27 July 2022  21:13:59 +0000 (0:00:00.031)       0:00:53.053 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/containers.conf.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists - user] ****
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:12
Wednesday 27 July 2022  21:13:59 +0000 (0:00:00.384)       0:00:53.438 ********
skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update system container config file] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:21
Wednesday 27 July 2022  21:13:59 +0000 (0:00:00.022)       0:00:53.460 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "checksum": "c56b033cc4634627f8794d6ccf07a051e0820c07", "dest": "/etc/containers/containers.conf.d/50-systemroles.conf", "gid": 0, "group": "root", "md5sum": "f25228df7b38eaff9b4f63b1b39baa1c", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 94, "src": "/root/.ansible/tmp/ansible-tmp-1658956439.660057-88607-66433915786883/source", "state": "file", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Update non-root user container config file] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:32
Wednesday 27 July 2022  21:14:00 +0000 (0:00:00.704)       0:00:54.165 ********
skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Wednesday 27 July 2022  21:14:00 +0000 (0:00:00.021)       0:00:54.186 ********
included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for /cache/rhel-8.qcow2

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists - system] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:3
Wednesday 27 July 2022  21:14:00 +0000 (0:00:00.032)       0:00:54.219 ********
ok: [/cache/rhel-8.qcow2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/registries.conf.d", "secontext": "system_u:object_r:etc_t:s0", "size": 107, "state": "directory", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists - user] ****
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:12
Wednesday 27 July 2022  21:14:00 +0000 (0:00:00.395)       0:00:54.614 ********
skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update system registries config file] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:21
Wednesday 27 July 2022  21:14:00 +0000 (0:00:00.020)       0:00:54.635 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "checksum": "15062ec12da5e642a2b0fb64c5e03d43b80d9cf0", "dest": "/etc/containers/registries.conf.d/50-systemroles.conf", "gid": 0, "group": "root", "md5sum": "88be21c8634b01869b9f694831b84c1d", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 22, "src": "/root/.ansible/tmp/ansible-tmp-1658956440.8358333-88657-241415948329867/source", "state": "file", "uid": 0 }

TASK [fedora.linux_system_roles.podman : Update non-root user registries config file] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:32
Wednesday 27 July 2022  21:14:01 +0000 (0:00:00.697)       0:00:55.333 ********
skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:17
Wednesday 27 July 2022  21:14:01 +0000 (0:00:00.021)       0:00:55.355 ********

TASK [fedora.linux_system_roles.firewall : include_tasks] **********************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:1
Wednesday 27 July 2022  21:14:01 +0000 (0:00:00.061)       0:00:55.416 ********
included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for /cache/rhel-8.qcow2

TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2
Wednesday 27 July 2022  21:14:01 +0000 (0:00:00.031)       0:00:55.448 ********
ok: [/cache/rhel-8.qcow2]

TASK [fedora.linux_system_roles.firewall : Install firewalld] ******************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:7
Wednesday 27 July 2022  21:14:02 +0000 (0:00:00.516)       0:00:55.964 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "rc": 0, "results": [ "Installed: python3-slip-0.6.4-11.el8.noarch", "Installed: python3-slip-dbus-0.6.4-11.el8.noarch", "Installed: ipset-7.1-1.el8.x86_64", "Installed: python3-nftables-1:0.9.3-25.el8.x86_64", "Installed: ipset-libs-7.1-1.el8.x86_64", "Installed: python3-firewall-0.9.3-13.el8.noarch", "Installed: iptables-ebtables-1.8.4-22.el8.x86_64", "Installed: firewalld-0.9.3-13.el8.noarch", "Installed: firewalld-filesystem-0.9.3-13.el8.noarch" ] }

TASK [fedora.linux_system_roles.firewall : Install python-firewall] ************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:12
Wednesday 27 July 2022  21:14:04 +0000 (0:00:02.447)       0:00:58.412 ********
skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.firewall : Install python3-firewall] ***********
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:18
Wednesday 27 July 2022  21:14:04 +0000 (0:00:00.030)       0:00:58.442 ********
ok: [/cache/rhel-8.qcow2] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] ***
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:3
Wednesday 27 July 2022  21:14:05 +0000 (0:00:01.258)       0:00:59.701 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "enabled": true, "name": "firewalld", "state": "started", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dbus.service polkit.service sysinit.target basic.target dbus.socket system.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "network-pre.target multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]",
"CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target ebtables.service iptables.service ip6tables.service nftables.service ipset.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "man:firewalld(1)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", 
"GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "6971", "LimitNPROCSoft": "6971", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "6971", "LimitSIGPENDINGSoft": "6971", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", 
"NUMAPolicy": "n/a", "Names": "firewalld.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target system.slice dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "11153", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": 
"enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9 Wednesday 27 July 2022 21:14:06 +0000 (0:00:01.026) 0:01:00.727 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__firewall_previous_replaced": false, "__firewall_python_cmd": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:14 Wednesday 27 July 2022 21:14:06 +0000 (0:00:00.077) 0:01:00.805 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Configure firewall] ***************** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:36 Wednesday 27 July 2022 21:14:06 +0000 (0:00:00.035) 0:01:00.840 ******** changed: [/cache/rhel-8.qcow2] => (item={'port': '8080-8082/tcp', 'state': 'enabled'}) => { "__firewall_changed": true, "ansible_loop_var": "item", "changed": true, "item": { "port": "8080-8082/tcp", "state": "enabled" } } TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:66 Wednesday 27 July 2022 21:14:07 +0000 (0:00:00.822) 0:01:01.662 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Calculate what has changed] ********* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:74 
Wednesday 27 July 2022 21:14:07 +0000 (0:00:00.058) 0:01:01.721 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Show diffs] ************************* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:80 Wednesday 27 July 2022 21:14:07 +0000 (0:00:00.052) 0:01:01.773 ******** skipping: [/cache/rhel-8.qcow2] => {} META: role_complete for /cache/rhel-8.qcow2 TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:24 Wednesday 27 July 2022 21:14:07 +0000 (0:00:00.057) 0:01:01.831 ******** redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.seboolean to ansible.posix.seboolean redirecting (type: modules) ansible.builtin.sefcontext to community.general.sefcontext redirecting (type: modules) community.general.sefcontext to community.general.system.sefcontext redirecting (type: modules) ansible.builtin.seport to community.general.seport redirecting (type: modules) community.general.seport to community.general.system.seport redirecting (type: modules) ansible.builtin.selogin to community.general.selogin redirecting (type: modules) community.general.selogin to community.general.system.selogin TASK [fedora.linux_system_roles.selinux : Set ansible_facts required by role and install packages] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:2 Wednesday 27 July 2022 21:14:08 +0000 (0:00:00.141) 0:01:01.972 ******** included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml for /cache/rhel-8.qcow2 TASK [fedora.linux_system_roles.selinux : 
Ensure ansible_facts used by role] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:2 Wednesday 27 July 2022 21:14:08 +0000 (0:00:00.036) 0:01:02.008 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Install SELinux python2 tools] ******* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:7 Wednesday 27 July 2022 21:14:08 +0000 (0:00:00.037) 0:01:02.045 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Install SELinux python3 tools] ******* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:15 Wednesday 27 July 2022 21:14:08 +0000 (0:00:00.025) 0:01:02.071 ******** ok: [/cache/rhel-8.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.selinux : refresh facts] *********************** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:23 Wednesday 27 July 2022 21:14:09 +0000 (0:00:01.244) 0:01:03.316 ******** ok: [/cache/rhel-8.qcow2] TASK [fedora.linux_system_roles.selinux : Install SELinux tool semanage] ******* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:28 Wednesday 27 July 2022 21:14:10 +0000 (0:00:00.796) 0:01:04.112 ******** ok: [/cache/rhel-8.qcow2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if enabled] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:5 Wednesday 27 July 2022 21:14:11 +0000 (0:00:01.272) 0:01:05.385 
******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if disabled] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:12 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.023) 0:01:05.408 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set selinux_reboot_required] ********* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:19 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.024) 0:01:05.433 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "selinux_reboot_required": false }, "changed": false } TASK [fedora.linux_system_roles.selinux : Fail if reboot is required] ********** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:23 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.051) 0:01:05.484 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Warn if SELinux is disabled] ********* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:28 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.023) 0:01:05.508 ******** skipping: [/cache/rhel-8.qcow2] => {} TASK [fedora.linux_system_roles.selinux : Drop all local modifications] ******** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:33 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.021) 0:01:05.530 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux boolean local modifications] *** task path: 
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:40 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.035) 0:01:05.565 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux file context local modifications] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:44 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.033) 0:01:05.599 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux port local modifications] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:48 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.032) 0:01:05.632 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux login local modifications] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:52 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.032) 0:01:05.665 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set SELinux booleans] **************** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:56 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.035) 0:01:05.701 ******** TASK [fedora.linux_system_roles.selinux : Set SELinux file contexts] *********** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:63 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.020) 0:01:05.721 ******** TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem 
tree] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:72 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.020) 0:01:05.742 ******** TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem tree in check mode] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:78 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.021) 0:01:05.764 ******** TASK [fedora.linux_system_roles.selinux : Set an SELinux label on a port] ****** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:86 Wednesday 27 July 2022 21:14:11 +0000 (0:00:00.020) 0:01:05.785 ******** redirecting (type: modules) ansible.builtin.seport to community.general.seport redirecting (type: modules) community.general.seport to community.general.system.seport changed: [/cache/rhel-8.qcow2] => (item={'ports': '8080-8082', 'setype': 'http_port_t'}) => { "ansible_loop_var": "item", "changed": true, "item": { "ports": "8080-8082", "setype": "http_port_t" }, "ports": [ "8080-8082" ], "proto": "tcp", "setype": "http_port_t", "state": "present" } TASK [fedora.linux_system_roles.selinux : Set linux user to SELinux user mapping] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:94 Wednesday 27 July 2022 21:14:14 +0000 (0:00:02.311) 0:01:08.097 ******** TASK [fedora.linux_system_roles.selinux : Get SELinux modules facts] *********** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:104 Wednesday 27 July 2022 21:14:14 +0000 (0:00:00.026) 0:01:08.123 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "selinux_installed_modules": { "abrt": { "100": "enabled" }, "accountsd": { "100": "enabled" }, "acct": { "100": "enabled" }, "afs": { "100": "enabled" }, "aiccu": { "100": "enabled" }, "aide": { "100": "enabled" }, "ajaxterm": { "100": 
"enabled" }, "alsa": { "100": "enabled" }, "amanda": { "100": "enabled" }, "amtu": { "100": "enabled" }, "anaconda": { "100": "enabled" }, "antivirus": { "100": "enabled" }, "apache": { "100": "enabled" }, "apcupsd": { "100": "enabled" }, "apm": { "100": "enabled" }, "application": { "100": "enabled" }, "arpwatch": { "100": "enabled" }, "asterisk": { "100": "enabled" }, "auditadm": { "100": "enabled" }, "authconfig": { "100": "enabled" }, "authlogin": { "100": "enabled" }, "automount": { "100": "enabled" }, "avahi": { "100": "enabled" }, "awstats": { "100": "enabled" }, "bacula": { "100": "enabled" }, "base": { "100": "enabled" }, "bcfg2": { "100": "enabled" }, "bind": { "100": "enabled" }, "bitlbee": { "100": "enabled" }, "blkmapd": { "100": "enabled" }, "blueman": { "100": "enabled" }, "bluetooth": { "100": "enabled" }, "boinc": { "100": "enabled" }, "boltd": { "100": "enabled" }, "bootloader": { "100": "enabled" }, "brctl": { "100": "enabled" }, "brltty": { "100": "enabled" }, "bugzilla": { "100": "enabled" }, "bumblebee": { "100": "enabled" }, "cachefilesd": { "100": "enabled" }, "calamaris": { "100": "enabled" }, "callweaver": { "100": "enabled" }, "canna": { "100": "enabled" }, "ccs": { "100": "enabled" }, "cdrecord": { "100": "enabled" }, "certmaster": { "100": "enabled" }, "certmonger": { "100": "enabled" }, "certwatch": { "100": "enabled" }, "cfengine": { "100": "enabled" }, "cgdcbxd": { "100": "enabled" }, "cgroup": { "100": "enabled" }, "chrome": { "100": "enabled" }, "chronyd": { "100": "enabled" }, "cinder": { "100": "enabled" }, "cipe": { "100": "enabled" }, "clock": { "100": "enabled" }, "clogd": { "100": "enabled" }, "cloudform": { "100": "enabled" }, "cmirrord": { "100": "enabled" }, "cobbler": { "100": "enabled" }, "cockpit": { "100": "enabled", "200": "enabled" }, "collectd": { "100": "enabled" }, "colord": { "100": "enabled" }, "comsat": { "100": "enabled" }, "condor": { "100": "enabled" }, "conman": { "100": "enabled" }, "conntrackd": { "100": 
"enabled" }, "consolekit": { "100": "enabled" }, "container": { "200": "enabled" }, "couchdb": { "100": "enabled" }, "courier": { "100": "enabled" }, "cpucontrol": { "100": "enabled" }, "cpufreqselector": { "100": "enabled" }, "cpuplug": { "100": "enabled" }, "cron": { "100": "enabled" }, "ctdb": { "100": "enabled" }, "cups": { "100": "enabled" }, "cvs": { "100": "enabled" }, "cyphesis": { "100": "enabled" }, "cyrus": { "100": "enabled" }, "daemontools": { "100": "enabled" }, "dbadm": { "100": "enabled" }, "dbskk": { "100": "enabled" }, "dbus": { "100": "enabled" }, "dcc": { "100": "enabled" }, "ddclient": { "100": "enabled" }, "denyhosts": { "100": "enabled" }, "devicekit": { "100": "enabled" }, "dhcp": { "100": "enabled" }, "dictd": { "100": "enabled" }, "dirsrv": { "100": "enabled" }, "dirsrv-admin": { "100": "enabled" }, "dmesg": { "100": "enabled" }, "dmidecode": { "100": "enabled" }, "dnsmasq": { "100": "enabled" }, "dnssec": { "100": "enabled" }, "dovecot": { "100": "enabled" }, "drbd": { "100": "enabled" }, "dspam": { "100": "enabled" }, "entropyd": { "100": "enabled" }, "exim": { "100": "enabled" }, "fail2ban": { "100": "enabled" }, "fcoe": { "100": "enabled" }, "fetchmail": { "100": "enabled" }, "finger": { "100": "enabled" }, "firewalld": { "100": "enabled" }, "firewallgui": { "100": "enabled" }, "firstboot": { "100": "enabled" }, "fprintd": { "100": "enabled" }, "freeipmi": { "100": "enabled" }, "freqset": { "100": "enabled" }, "fstools": { "100": "enabled" }, "ftp": { "100": "enabled" }, "fwupd": { "100": "enabled" }, "games": { "100": "enabled" }, "gdomap": { "100": "enabled" }, "geoclue": { "100": "enabled" }, "getty": { "100": "enabled" }, "git": { "100": "enabled" }, "gitosis": { "100": "enabled" }, "glance": { "100": "enabled" }, "gnome": { "100": "enabled" }, "gpg": { "100": "enabled" }, "gpm": { "100": "enabled" }, "gpsd": { "100": "enabled" }, "gssproxy": { "100": "enabled" }, "guest": { "100": "enabled" }, "hddtemp": { "100": "enabled" }, 
"hostapd": { "100": "enabled" }, "hostname": { "100": "enabled" }, "hsqldb": { "100": "enabled" }, "hwloc": { "100": "enabled" }, "hypervkvp": { "100": "enabled" }, "ibacm": { "100": "enabled" }, "icecast": { "100": "enabled" }, "inetd": { "100": "enabled" }, "init": { "100": "enabled" }, "inn": { "100": "enabled" }, "insights_client": { "100": "enabled" }, "iodine": { "100": "enabled" }, "iotop": { "100": "enabled" }, "ipmievd": { "100": "enabled" }, "ipsec": { "100": "enabled" }, "iptables": { "100": "enabled" }, "irc": { "100": "enabled" }, "irqbalance": { "100": "enabled" }, "iscsi": { "100": "enabled" }, "isns": { "100": "enabled" }, "jabber": { "100": "enabled" }, "jetty": { "100": "enabled" }, "jockey": { "100": "enabled" }, "journalctl": { "100": "enabled" }, "kdbus": { "100": "enabled" }, "kdump": { "100": "enabled" }, "kdumpgui": { "100": "enabled" }, "keepalived": { "100": "enabled" }, "kerberos": { "100": "enabled" }, "keyboardd": { "100": "enabled" }, "keystone": { "100": "enabled" }, "kismet": { "100": "enabled" }, "kmscon": { "100": "enabled" }, "kpatch": { "100": "enabled" }, "ksmtuned": { "100": "enabled" }, "ktalk": { "100": "enabled" }, "l2tp": { "100": "enabled" }, "ldap": { "100": "enabled" }, "libraries": { "100": "enabled" }, "likewise": { "100": "enabled" }, "linuxptp": { "100": "enabled" }, "lircd": { "100": "enabled" }, "livecd": { "100": "enabled" }, "lldpad": { "100": "enabled" }, "loadkeys": { "100": "enabled" }, "locallogin": { "100": "enabled" }, "lockdev": { "100": "enabled" }, "logadm": { "100": "enabled" }, "logging": { "100": "enabled" }, "logrotate": { "100": "enabled" }, "logwatch": { "100": "enabled" }, "lpd": { "100": "enabled" }, "lsm": { "100": "enabled" }, "lttng-tools": { "100": "enabled" }, "lvm": { "100": "enabled" }, "mailman": { "100": "enabled" }, "mailscanner": { "100": "enabled" }, "man2html": { "100": "enabled" }, "mandb": { "100": "enabled" }, "mcelog": { "100": "enabled" }, "mediawiki": { "100": "enabled" }, 
"memcached": { "100": "enabled" }, "milter": { "100": "enabled" }, "minidlna": { "100": "enabled" }, "minissdpd": { "100": "enabled" }, "mip6d": { "100": "enabled" }, "mirrormanager": { "100": "enabled" }, "miscfiles": { "100": "enabled" }, "mock": { "100": "enabled" }, "modemmanager": { "100": "enabled" }, "modutils": { "100": "enabled" }, "mojomojo": { "100": "enabled" }, "mon_statd": { "100": "enabled" }, "mongodb": { "100": "enabled" }, "motion": { "100": "enabled" }, "mount": { "100": "enabled" }, "mozilla": { "100": "enabled" }, "mpd": { "100": "enabled" }, "mplayer": { "100": "enabled" }, "mrtg": { "100": "enabled" }, "mta": { "100": "enabled" }, "munin": { "100": "enabled" }, "mysql": { "100": "enabled" }, "mythtv": { "100": "enabled" }, "naemon": { "100": "enabled" }, "nagios": { "100": "enabled" }, "namespace": { "100": "enabled" }, "ncftool": { "100": "enabled" }, "netlabel": { "100": "enabled" }, "netutils": { "100": "enabled" }, "networkmanager": { "100": "enabled" }, "ninfod": { "100": "enabled" }, "nis": { "100": "enabled" }, "nova": { "100": "enabled" }, "nscd": { "100": "enabled" }, "nsd": { "100": "enabled" }, "nslcd": { "100": "enabled" }, "ntop": { "100": "enabled" }, "ntp": { "100": "enabled" }, "numad": { "100": "enabled" }, "nut": { "100": "enabled" }, "nx": { "100": "enabled" }, "obex": { "100": "enabled" }, "oddjob": { "100": "enabled" }, "opafm": { "100": "enabled" }, "openct": { "100": "enabled" }, "opendnssec": { "100": "enabled" }, "openfortivpn": { "100": "enabled" }, "openhpid": { "100": "enabled" }, "openshift": { "100": "enabled" }, "openshift-origin": { "100": "enabled" }, "opensm": { "100": "enabled" }, "openvpn": { "100": "enabled" }, "openvswitch": { "100": "enabled" }, "openwsman": { "100": "enabled" }, "oracleasm": { "100": "enabled" }, "osad": { "100": "enabled" }, "pads": { "100": "enabled" }, "passenger": { "100": "enabled" }, "pcmcia": { "100": "enabled" }, "pcp": { "100": "enabled" }, "pcscd": { "100": "enabled" }, 
"pdns": { "100": "enabled" }, "pegasus": { "100": "enabled" }, "permissivedomains": { "100": "enabled" }, "pesign": { "100": "enabled" }, "pingd": { "100": "enabled" }, "piranha": { "100": "enabled" }, "pkcs": { "100": "enabled" }, "pkcs11proxyd": { "100": "enabled" }, "pki": { "100": "enabled" }, "plymouthd": { "100": "enabled" }, "podsleuth": { "100": "enabled" }, "policykit": { "100": "enabled" }, "polipo": { "100": "enabled" }, "portmap": { "100": "enabled" }, "portreserve": { "100": "enabled" }, "postfix": { "100": "enabled" }, "postgresql": { "100": "enabled" }, "postgrey": { "100": "enabled" }, "ppp": { "100": "enabled" }, "prelink": { "100": "enabled" }, "prelude": { "100": "enabled" }, "privoxy": { "100": "enabled" }, "procmail": { "100": "enabled" }, "prosody": { "100": "enabled" }, "psad": { "100": "enabled" }, "ptchown": { "100": "enabled" }, "publicfile": { "100": "enabled" }, "pulseaudio": { "100": "enabled" }, "puppet": { "100": "enabled" }, "pwauth": { "100": "enabled" }, "qmail": { "100": "enabled" }, "qpid": { "100": "enabled" }, "quantum": { "100": "enabled" }, "quota": { "100": "enabled" }, "rabbitmq": { "100": "enabled" }, "radius": { "100": "enabled" }, "radvd": { "100": "enabled" }, "raid": { "100": "enabled" }, "rasdaemon": { "100": "enabled" }, "rdisc": { "100": "enabled" }, "readahead": { "100": "enabled" }, "realmd": { "100": "enabled" }, "redis": { "100": "enabled" }, "remotelogin": { "100": "enabled" }, "rhcs": { "100": "enabled" }, "rhev": { "100": "enabled" }, "rhgb": { "100": "enabled" }, "rhnsd": { "100": "enabled" }, "rhsmcertd": { "100": "enabled" }, "ricci": { "100": "enabled" }, "rkhunter": { "100": "enabled" }, "rkt": { "100": "enabled" }, "rlogin": { "100": "enabled" }, "rngd": { "100": "enabled" }, "rolekit": { "100": "enabled" }, "roundup": { "100": "enabled" }, "rpc": { "100": "enabled" }, "rpcbind": { "100": "enabled" }, "rpm": { "100": "enabled" }, "rrdcached": { "100": "enabled" }, "rshd": { "100": "enabled" }, "rssh": { 
"100": "enabled" }, "rsync": { "100": "enabled" }, "rtas": { "100": "enabled" }, "rtkit": { "100": "enabled" }, "rwho": { "100": "enabled" }, "samba": { "100": "enabled" }, "sambagui": { "100": "enabled" }, "sandboxX": { "100": "enabled" }, "sanlock": { "100": "enabled" }, "sasl": { "100": "enabled" }, "sbd": { "100": "enabled" }, "sblim": { "100": "enabled" }, "screen": { "100": "enabled" }, "secadm": { "100": "enabled" }, "sectoolm": { "100": "enabled" }, "selinuxutil": { "100": "enabled" }, "sendmail": { "100": "enabled" }, "sensord": { "100": "enabled" }, "setrans": { "100": "enabled" }, "setroubleshoot": { "100": "enabled" }, "seunshare": { "100": "enabled" }, "sge": { "100": "enabled" }, "shorewall": { "100": "enabled" }, "slocate": { "100": "enabled" }, "slpd": { "100": "enabled" }, "smartmon": { "100": "enabled" }, "smokeping": { "100": "enabled" }, "smoltclient": { "100": "enabled" }, "smsd": { "100": "enabled" }, "snapper": { "100": "enabled" }, "snmp": { "100": "enabled" }, "snort": { "100": "enabled" }, "sosreport": { "100": "enabled" }, "soundserver": { "100": "enabled" }, "spamassassin": { "100": "enabled" }, "speech-dispatcher": { "100": "enabled" }, "squid": { "100": "enabled" }, "ssh": { "100": "enabled" }, "sslh": { "100": "enabled" }, "sssd": { "100": "enabled" }, "staff": { "100": "enabled" }, "stapserver": { "100": "enabled" }, "stratisd": { "100": "enabled" }, "stunnel": { "100": "enabled" }, "su": { "100": "enabled" }, "sudo": { "100": "enabled" }, "svnserve": { "100": "enabled" }, "swift": { "100": "enabled" }, "sysadm": { "100": "enabled" }, "sysadm_secadm": { "100": "enabled" }, "sysnetwork": { "100": "enabled" }, "sysstat": { "100": "enabled" }, "systemd": { "100": "enabled" }, "tangd": { "100": "enabled" }, "targetd": { "100": "enabled" }, "tcpd": { "100": "enabled" }, "tcsd": { "100": "enabled" }, "telepathy": { "100": "enabled" }, "telnet": { "100": "enabled" }, "tftp": { "100": "enabled" }, "tgtd": { "100": "enabled" }, "thin": { 
"100": "enabled" }, "thumb": { "100": "enabled" }, "timedatex": { "100": "enabled" }, "tlp": { "100": "enabled" }, "tmpreaper": { "100": "enabled" }, "tomcat": { "100": "enabled" }, "tor": { "100": "enabled" }, "tuned": { "100": "enabled" }, "tvtime": { "100": "enabled" }, "udev": { "100": "enabled" }, "ulogd": { "100": "enabled" }, "uml": { "100": "enabled" }, "unconfined": { "100": "enabled" }, "unconfineduser": { "100": "enabled" }, "unlabelednet": { "100": "enabled" }, "unprivuser": { "100": "enabled" }, "updfstab": { "100": "enabled" }, "usbmodules": { "100": "enabled" }, "usbmuxd": { "100": "enabled" }, "userdomain": { "100": "enabled" }, "userhelper": { "100": "enabled" }, "usermanage": { "100": "enabled" }, "usernetctl": { "100": "enabled" }, "uucp": { "100": "enabled" }, "uuidd": { "100": "enabled" }, "varnishd": { "100": "enabled" }, "vdagent": { "100": "enabled" }, "vhostmd": { "100": "enabled" }, "virt": { "100": "enabled" }, "vlock": { "100": "enabled" }, "vmtools": { "100": "enabled" }, "vmware": { "100": "enabled" }, "vnstatd": { "100": "enabled" }, "vpn": { "100": "enabled" }, "w3c": { "100": "enabled" }, "watchdog": { "100": "enabled" }, "wdmd": { "100": "enabled" }, "webadm": { "100": "enabled" }, "webalizer": { "100": "enabled" }, "wine": { "100": "enabled" }, "wireshark": { "100": "enabled" }, "xen": { "100": "enabled" }, "xguest": { "100": "enabled" }, "xserver": { "100": "enabled" }, "zabbix": { "100": "enabled" }, "zarafa": { "100": "enabled" }, "zebra": { "100": "enabled" }, "zoneminder": { "100": "enabled" }, "zosremote": { "100": "enabled" } }, "selinux_priorities": true }, "changed": false } TASK [fedora.linux_system_roles.selinux : include_tasks] *********************** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:107 Wednesday 27 July 2022 21:14:14 +0000 (0:00:00.542) 0:01:08.665 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result 
was False" } META: role_complete for /cache/rhel-8.qcow2 TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] ***** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:31 Wednesday 27 July 2022 21:14:14 +0000 (0:00:00.053) 0:01:08.718 ******** included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/rhel-8.qcow2 => (item={'state': 'started', 'debug': True, 'log_level': 'debug', 'run_as_user': 'user1', 'kube_file_content': {'apiVersion': 'v1', 'kind': 'Pod', 'metadata': {'labels': {'app': 'test', 'io.containers.autoupdate': 'registry'}, 'name': 'httpd1'}, 'spec': {'containers': [{'name': 'httpd1', 'image': 'quay.io/libpod/testimage:20210610', 'command': ['/bin/busybox-extras', 'httpd', '-f', '-p', 80], 'ports': [{'containerPort': 80, 'hostPort': 8080}], 'volumeMounts': [{'mountPath': '/var/www:Z', 'name': 'www'}, {'mountPath': '/var/httpd-create:Z', 'name': 'create'}], 'workingDir': '/var/www'}], 'volumes': [{'name': 'www', 'hostPath': {'path': '/tmp/httpd1'}}, {'name': 'create', 'hostPath': {'path': '/tmp/httpd1-create'}}]}}}) included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/rhel-8.qcow2 => (item={'state': 'started', 'debug': True, 'log_level': 'debug', 'kube_file_content': {'apiVersion': 'v1', 'kind': 'Pod', 'metadata': {'labels': {'app': 'test', 'io.containers.autoupdate': 'registry'}, 'name': 'httpd2'}, 'spec': {'containers': [{'name': 'httpd2', 'image': 'quay.io/libpod/testimage:20210610', 'command': ['/bin/busybox-extras', 'httpd', '-f', '-p', 80], 'ports': [{'containerPort': 80, 'hostPort': 8081}], 'volumeMounts': [{'mountPath': '/var/www:Z', 'name': 'www'}, {'mountPath': '/var/httpd-create:Z', 'name': 'create'}], 'workingDir': '/var/www'}], 'volumes': [{'name': 'www', 'hostPath': {'path': '/tmp/httpd2'}}, {'name': 'create', 'hostPath': {'path': 
'/tmp/httpd2-create'}}]}}}) included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/rhel-8.qcow2 => (item={'state': 'started', 'kube_file_src': '/tmp/lsr_podman_dkhpy8i1.yml'}) TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:13 Wednesday 27 July 2022 21:14:14 +0000 (0:00:00.134) 0:01:08.853 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__podman_kube_spec": { "debug": true, "log_level": "debug", "state": "started" }, "__podman_kube_str": "apiVersion: v1\nkind: Pod\nmetadata:\n labels:\n app: test\n io.containers.autoupdate: registry\n name: httpd1\nspec:\n containers:\n - command:\n - /bin/busybox-extras\n - httpd\n - -f\n - -p\n - 80\n image: quay.io/libpod/testimage:20210610\n name: httpd1\n ports:\n - containerPort: 80\n hostPort: 8080\n volumeMounts:\n - mountPath: /var/www:Z\n name: www\n - mountPath: /var/httpd-create:Z\n name: create\n workingDir: /var/www\n volumes:\n - hostPath:\n path: /tmp/httpd1\n name: www\n - hostPath:\n path: /tmp/httpd1-create\n name: create\n" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:25 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.056) 0:01:08.909 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__podman_kube": { "apiVersion": "v1", "kind": "Pod", "metadata": { "labels": { "app": "test", "io.containers.autoupdate": "registry" }, "name": "httpd1" }, "spec": { "containers": [ { "command": [ "/bin/busybox-extras", "httpd", "-f", "-p", 80 ], "image": "quay.io/libpod/testimage:20210610", "name": "httpd1", "ports": [ { "containerPort": 80, "hostPort": 8080 } ], "volumeMounts": [ { "mountPath": "/var/www:Z", "name": "www" }, { 
"mountPath": "/var/httpd-create:Z", "name": "create" } ], "workingDir": "/var/www" } ], "volumes": [ { "hostPath": { "path": "/tmp/httpd1" }, "name": "www" }, { "hostPath": { "path": "/tmp/httpd1-create" }, "name": "create" } ] } }, "__podman_kube_file": "", "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "user1" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:35 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.056) 0:01:08.966 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__podman_kube_name": "httpd1", "__podman_rootless": true }, "changed": false } TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:40 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.040) 0:01:09.006 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "getent_passwd": { "user1": [ "x", "1001", "1001", "", "/home/user1", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:46 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.476) 0:01:09.483 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if no kube spec is given] ******** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:53 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.028) 0:01:09.511 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set 
per-container variables part 3] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:62 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.030) 0:01:09.541 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_group": "1001", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/user1", "__podman_xdg_runtime_dir": "/run/user/1001" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:78 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.058) 0:01:09.599 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__podman_kube_path": "/home/user1/.config/containers/ansible-kubernetes.d" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:82 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.037) 0:01:09.637 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__podman_kube_file": "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get service name using systemd-escape] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:86 Wednesday 27 July 2022 21:14:15 +0000 (0:00:00.041) 0:01:09.679 ******** changed: [/cache/rhel-8.qcow2] => { "changed": true, "cmd": [ "systemd-escape", "--template", "podman-kube@.service", "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" ], "delta": "0:00:00.006496", "end": "2022-07-27 17:14:14.956340", "rc": 0, "start": "2022-07-27 17:14:14.949844" } STDOUT: podman-kube@-home-user1-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service 
TASK [fedora.linux_system_roles.podman : Cleanup containers and services] ****** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:93 Wednesday 27 July 2022 21:14:16 +0000 (0:00:00.402) 0:01:10.082 ******** skipping: [/cache/rhel-8.qcow2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update containers and services] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:97 Wednesday 27 July 2022 21:14:16 +0000 (0:00:00.026) 0:01:10.109 ******** included: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml for /cache/rhel-8.qcow2 TASK [fedora.linux_system_roles.podman : Check if user is lingering] *********** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:1 Wednesday 27 July 2022 21:14:16 +0000 (0:00:00.056) 0:01:10.165 ******** ok: [/cache/rhel-8.qcow2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Enable lingering if needed] *********** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:7 Wednesday 27 July 2022 21:14:16 +0000 (0:00:00.377) 0:01:10.543 ******** changed: [/cache/rhel-8.qcow2] => { "changed": true, "cmd": [ "loginctl", "enable-linger", "user1" ], "delta": "0:00:00.029507", "end": "2022-07-27 17:14:15.841136", "rc": 0, "start": "2022-07-27 17:14:15.811629" } TASK [fedora.linux_system_roles.podman : Get the host mount volumes] *********** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:13 Wednesday 27 July 2022 21:14:17 +0000 (0:00:00.428) 0:01:10.972 ******** ok: [/cache/rhel-8.qcow2] => { "ansible_facts": { "__podman_volumes": [ 
"/tmp/httpd1", "/tmp/httpd1-create" ] }, "changed": false } TASK [fedora.linux_system_roles.podman : Create host directories] ************** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:24 Wednesday 27 July 2022 21:14:17 +0000 (0:00:00.109) 0:01:11.081 ******** [WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat- unsafe) changed: [/cache/rhel-8.qcow2] => (item=/tmp/httpd1) => { "ansible_loop_var": "item", "changed": true, "gid": 1001, "group": "user1", "item": "/tmp/httpd1", "mode": "0644", "owner": "user1", "path": "/tmp/httpd1", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 23, "state": "directory", "uid": 1001 } changed: [/cache/rhel-8.qcow2] => (item=/tmp/httpd1-create) => { "ansible_loop_var": "item", "changed": true, "gid": 1001, "group": "user1", "item": "/tmp/httpd1-create", "mode": "0644", "owner": "user1", "path": "/tmp/httpd1-create", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Ensure container images are present] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:35 Wednesday 27 July 2022 21:14:17 +0000 (0:00:00.768) 0:01:11.850 ******** changed: [/cache/rhel-8.qcow2] => (item=quay.io/libpod/testimage:20210610) => { "actions": [ "Pulled image quay.io/libpod/testimage:20210610" ], "ansible_loop_var": "item", "changed": true, "image": [ { "Annotations": {}, "Architecture": "amd64", "Author": "", "Comment": "", "Config": { "Cmd": [ "/bin/echo", "This container is intended for podman CI testing" ], "Env": [ "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin" ], "Labels": { "created_at": "2021-06-10T18:55:36Z", "created_by": "test/system/build-testimage", "io.buildah.version": 
"1.21.0" }, "WorkingDir": "/home/podman" }, "Created": "2021-06-10T18:55:43.049643585Z", "Digest": "sha256:d48f2feaca74863c342cd9ce11edbe208675975740e7f4dd635b7b345339426a", "GraphDriver": { "Data": { "UpperDir": "/home/user1/.local/share/containers/storage/overlay/f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e/diff", "WorkDir": "/home/user1/.local/share/containers/storage/overlay/f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e/work" }, "Name": "overlay" }, "History": [ { "created": "2021-06-10T18:55:42.831917915Z", "created_by": "/bin/sh -c apk add busybox-extras", "empty_layer": true }, { "created": "2021-06-10T18:55:43.005956291Z", "created_by": "/bin/sh -c #(nop) ADD multi:0ed825786ec12498034356148303d2e6dfd4698131f4b5d4599e5eafa2ab71bd in /home/podman/ ", "empty_layer": true }, { "created": "2021-06-10T18:55:43.006000972Z", "created_by": "/bin/sh -c #(nop) LABEL created_by=test/system/build-testimage", "empty_layer": true }, { "created": "2021-06-10T18:55:43.006019818Z", "created_by": "/bin/sh -c #(nop) LABEL created_at=2021-06-10T18:55:36Z", "empty_layer": true }, { "created": "2021-06-10T18:55:43.028748885Z", "created_by": "/bin/sh -c #(nop) WORKDIR /home/podman", "empty_layer": true }, { "comment": "FROM docker.io/amd64/alpine:3.13.5", "created": "2021-06-10T18:55:43.160651456Z", "created_by": "/bin/sh -c #(nop) CMD [\"/bin/echo\", \"This container is intended for podman CI testing\"]" } ], "Id": "9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f", "Labels": { "created_at": "2021-06-10T18:55:36Z", "created_by": "test/system/build-testimage", "io.buildah.version": "1.21.0" }, "ManifestType": "application/vnd.docker.distribution.manifest.v2+json", "NamesHistory": [ "quay.io/libpod/testimage:20210610" ], "Os": "linux", "Parent": "", "RepoDigests": [ "quay.io/libpod/testimage@sha256:d48f2feaca74863c342cd9ce11edbe208675975740e7f4dd635b7b345339426a", 
"quay.io/libpod/testimage@sha256:d8dc9f2a78e190963a75852ce55b926a1cf90c7d2e6d15b30b6bc43cd73a6377" ], "RepoTags": [ "quay.io/libpod/testimage:20210610" ], "RootFS": { "Layers": [ "sha256:f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e" ], "Type": "layers" }, "Size": 7987860, "User": "", "Version": "", "VirtualSize": 7987860 } ], "item": "quay.io/libpod/testimage:20210610", "podman_actions": [ "/bin/podman image ls quay.io/libpod/testimage:20210610 --format json", "/bin/podman pull quay.io/libpod/testimage:20210610 -q", "/bin/podman inspect 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f --format json" ], "warnings": [ "Module remote_tmp /home/user1/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually" ] } [WARNING]: Module remote_tmp /home/user1/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. 
To avoid this, create the remote_tmp dir with the correct permissions manually TASK [fedora.linux_system_roles.podman : Check the kubernetes yaml file] ******* task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:51 Wednesday 27 July 2022 21:14:20 +0000 (0:00:02.058) 0:01:13.909 ******** ok: [/cache/rhel-8.qcow2] => { "changed": false, "failed_when_result": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Ensure the kubernetes directory is present] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:59 Wednesday 27 July 2022 21:14:20 +0000 (0:00:00.391) 0:01:14.300 ******** changed: [/cache/rhel-8.qcow2] => { "changed": true, "gid": 1001, "group": "user1", "mode": "0700", "owner": "user1", "path": "/home/user1/.config/containers/ansible-kubernetes.d", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 6, "state": "directory", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Ensure kubernetes yaml files are present] *** task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:68 Wednesday 27 July 2022 21:14:20 +0000 (0:00:00.403) 0:01:14.704 ******** changed: [/cache/rhel-8.qcow2] => { "changed": true, "checksum": "f36f5b3fd8752a059ae217d04f65ba46b054d773", "dest": "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml", "gid": 1001, "group": "user1", "md5sum": "47ccac83d9de77d9645ae1ef4733269a", "mode": "0600", "owner": "user1", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 721, "src": "/root/.ansible/tmp/ansible-tmp-1658956460.9223995-89068-178272664634632/source", "state": "file", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Update containers/pods] *************** task path: 
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:78 Wednesday 27 July 2022 21:14:21 +0000 (0:00:00.717) 0:01:15.422 ******** [WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe) changed: [/cache/rhel-8.qcow2] => { "actions": [ "/bin/podman play kube --start=true --log-level=debug /home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" ], "changed": true } STDOUT: Pod: 6280c4d6b1405c1d399d72fe527f90ab1ca6864bb75dc62dc29582f05597aba3 Container: dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 STDERR: time="2022-07-27T17:14:20-04:00" level=info msg="/bin/podman filtering at log level debug" time="2022-07-27T17:14:20-04:00" level=debug msg="Called kube.PersistentPreRunE(/bin/podman play kube --start=true --log-level=debug /home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml)" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:14:20-04:00" level=debug msg="Merged system config \"/usr/share/containers/containers.conf\"" time="2022-07-27T17:14:20-04:00" level=debug msg="Merged system config \"/etc/containers/containers.conf.d/50-systemroles.conf\"" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:14:20-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" time="2022-07-27T17:14:20-04:00" level=debug msg="Initializing boltdb state at /home/user1/.local/share/containers/storage/libpod/bolt_state.db" time="2022-07-27T17:14:20-04:00" level=debug msg="Using graph driver overlay" time="2022-07-27T17:14:20-04:00" level=debug msg="Using graph root /home/user1/.local/share/containers/storage" time="2022-07-27T17:14:20-04:00" level=debug msg="Using run root /run/user/1001/containers" time="2022-07-27T17:14:20-04:00" level=debug 
msg="Using static dir /home/user1/.local/share/containers/storage/libpod" time="2022-07-27T17:14:20-04:00" level=debug msg="Using tmp dir /run/user/1001/libpod/tmp" time="2022-07-27T17:14:20-04:00" level=debug msg="Using volume path /home/user1/.local/share/containers/storage/volumes" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:14:20-04:00" level=debug msg="Set libpod namespace to \"\"" time="2022-07-27T17:14:20-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that metacopy is not being used" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that native-diff is usable" time="2022-07-27T17:14:20-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false" time="2022-07-27T17:14:20-04:00" level=debug msg="Initializing event backend file" time="2022-07-27T17:14:20-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument" time="2022-07-27T17:14:20-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument" time="2022-07-27T17:14:20-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument" time="2022-07-27T17:14:20-04:00" level=debug msg="Configured OCI runtime crun initialization failed: no valid executable found for OCI runtime crun: invalid argument" time="2022-07-27T17:14:20-04:00" level=debug msg="Using OCI runtime \"/usr/bin/runc\"" time="2022-07-27T17:14:20-04:00" level=info msg="Setting parallel job count to 13" time="2022-07-27T17:14:20-04:00" level=debug msg="Looking up image 
\"localhost/podman-pause:4.0.2-1648830555\" in local containers storage" time="2022-07-27T17:14:20-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:14:20-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." time="2022-07-27T17:14:20-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." time="2022-07-27T17:14:20-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." time="2022-07-27T17:14:20-04:00" level=debug msg="FROM \"scratch\"" time="2022-07-27T17:14:20-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T17:14:20-04:00" level=debug msg="Cached value indicated that metacopy is not being used" time="2022-07-27T17:14:20-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false" time="2022-07-27T17:14:20-04:00" level=debug msg="overlay: test mount indicated that volatile is being used" time="2022-07-27T17:14:20-04:00" level=debug msg="overlay: mount_data=,lowerdir=/home/user1/.local/share/containers/storage/overlay/7d22b0a37f7b394c4555fb21995cdf859edf617d355d07eb0f9eb572fc2dd244/empty,upperdir=/home/user1/.local/share/containers/storage/overlay/7d22b0a37f7b394c4555fb21995cdf859edf617d355d07eb0f9eb572fc2dd244/diff,workdir=/home/user1/.local/share/containers/storage/overlay/7d22b0a37f7b394c4555fb21995cdf859edf617d355d07eb0f9eb572fc2dd244/work,userxattr,volatile,context=\"system_u:object_r:container_file_t:s0:c817,c991\"" time="2022-07-27T17:14:20-04:00" level=debug msg="Container ID: 715196a684269ed029ad58a2ab02256f0778ea569bcd2a29c7295e8c2823acc4" time="2022-07-27T17:14:20-04:00" level=debug msg="Parsed Step: {Env:[PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin] Command:copy Args:[/usr/libexec/podman/catatonit /catatonit] Flags:[] 
Attrs:map[] Message:COPY /usr/libexec/podman/catatonit /catatonit Original:COPY /usr/libexec/podman/catatonit /catatonit}" time="2022-07-27T17:14:20-04:00" level=debug msg="COPY []string(nil), imagebuilder.Copy{FromFS:false, From:\"\", Src:[]string{\"/usr/libexec/podman/catatonit\"}, Dest:\"/catatonit\", Download:false, Chown:\"\", Chmod:\"\"}" time="2022-07-27T17:14:21-04:00" level=debug msg="added content file:93c5e27851a4b6d10dd2d9b1d99fbaba16388e80dfc49f83a02718d76b535656" time="2022-07-27T17:14:21-04:00" level=debug msg="Parsed Step: {Env:[PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin] Command:entrypoint Args:[/catatonit -P] Flags:[] Attrs:map[json:true] Message:ENTRYPOINT /catatonit -P Original:ENTRYPOINT [\"/catatonit\", \"-P\"]}" time="2022-07-27T17:14:21-04:00" level=debug msg="COMMIT localhost/podman-pause:4.0.2-1648830555" time="2022-07-27T17:14:21-04:00" level=debug msg="Looking up image \"localhost/podman-pause:4.0.2-1648830555\" in local containers storage" time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." 
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.0.2-1648830555\"" time="2022-07-27T17:14:21-04:00" level=debug msg="COMMIT \"containers-storage:[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.0.2-1648830555\"" time="2022-07-27T17:14:21-04:00" level=debug msg="committing image with reference \"containers-storage:[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.0.2-1648830555\" is allowed by policy" time="2022-07-27T17:14:21-04:00" level=debug msg="layer list: [\"7d22b0a37f7b394c4555fb21995cdf859edf617d355d07eb0f9eb572fc2dd244\"]" time="2022-07-27T17:14:21-04:00" level=debug msg="using \"/var/tmp/buildah1891714456\" to hold temporary data" time="2022-07-27T17:14:21-04:00" level=debug msg="Tar with options on /home/user1/.local/share/containers/storage/overlay/7d22b0a37f7b394c4555fb21995cdf859edf617d355d07eb0f9eb572fc2dd244/diff" time="2022-07-27T17:14:21-04:00" level=debug msg="layer \"7d22b0a37f7b394c4555fb21995cdf859edf617d355d07eb0f9eb572fc2dd244\" size is 737792 bytes, uncompressed digest sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877, possibly-compressed digest sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877" time="2022-07-27T17:14:21-04:00" level=debug msg="OCIv1 config = {\"created\":\"2022-07-27T21:14:21.114916664Z\",\"architecture\":\"amd64\",\"os\":\"linux\",\"config\":{\"Env\":[\"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\"],\"Entrypoint\":[\"/catatonit\",\"-P\"],\"Labels\":{\"io.buildah.version\":\"1.24.1\"}},\"rootfs\":{\"type\":\"layers\",\"diff_ids\":[\"sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877\"]},\"history\":[{\"created\":\"2022-07-27T21:14:21.114280198Z\",\"created_by\":\"/bin/sh -c #(nop) COPY 
file:93c5e27851a4b6d10dd2d9b1d99fbaba16388e80dfc49f83a02718d76b535656 in /catatonit \",\"empty_layer\":true},{\"created\":\"2022-07-27T21:14:21.118438633Z\",\"created_by\":\"/bin/sh -c #(nop) ENTRYPOINT [\\\"/catatonit\\\", \\\"-P\\\"]\"}]}" time="2022-07-27T17:14:21-04:00" level=debug msg="OCIv1 manifest = {\"schemaVersion\":2,\"mediaType\":\"application/vnd.oci.image.manifest.v1+json\",\"config\":{\"mediaType\":\"application/vnd.oci.image.config.v1+json\",\"digest\":\"sha256:8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\",\"size\":668},\"layers\":[{\"mediaType\":\"application/vnd.oci.image.layer.v1.tar\",\"digest\":\"sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877\",\"size\":737792}],\"annotations\":{\"org.opencontainers.image.base.digest\":\"\",\"org.opencontainers.image.base.name\":\"\"}}" time="2022-07-27T17:14:21-04:00" level=debug msg="Docker v2s2 config = {\"created\":\"2022-07-27T21:14:21.114916664Z\",\"container\":\"715196a684269ed029ad58a2ab02256f0778ea569bcd2a29c7295e8c2823acc4\",\"container_config\":{\"Hostname\":\"\",\"Domainname\":\"\",\"User\":\"\",\"AttachStdin\":false,\"AttachStdout\":false,\"AttachStderr\":false,\"Tty\":false,\"OpenStdin\":false,\"StdinOnce\":false,\"Env\":[\"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\"],\"Cmd\":[],\"Image\":\"\",\"Volumes\":{},\"WorkingDir\":\"\",\"Entrypoint\":[\"/catatonit\",\"-P\"],\"OnBuild\":[],\"Labels\":{\"io.buildah.version\":\"1.24.1\"}},\"config\":{\"Hostname\":\"\",\"Domainname\":\"\",\"User\":\"\",\"AttachStdin\":false,\"AttachStdout\":false,\"AttachStderr\":false,\"Tty\":false,\"OpenStdin\":false,\"StdinOnce\":false,\"Env\":[\"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\"],\"Cmd\":[],\"Image\":\"\",\"Volumes\":{},\"WorkingDir\":\"\",\"Entrypoint\":[\"/catatonit\",\"-P\"],\"OnBuild\":[],\"Labels\":{\"io.buildah.version\":\"1.24.1\"}},\"architecture\":\"amd64\",\"os\":\"linux\",\"rootfs\":{\"type\":\"layers\",\"diff
_ids\":[\"sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877\"]},\"history\":[{\"created\":\"2022-07-27T21:14:21.114280198Z\",\"created_by\":\"/bin/sh -c #(nop) COPY file:93c5e27851a4b6d10dd2d9b1d99fbaba16388e80dfc49f83a02718d76b535656 in /catatonit \",\"empty_layer\":true},{\"created\":\"2022-07-27T21:14:21.118438633Z\",\"created_by\":\"/bin/sh -c #(nop) ENTRYPOINT [\\\"/catatonit\\\", \\\"-P\\\"]\"}]}" time="2022-07-27T17:14:21-04:00" level=debug msg="Docker v2s2 manifest = {\"schemaVersion\":2,\"mediaType\":\"application/vnd.docker.distribution.manifest.v2+json\",\"config\":{\"mediaType\":\"application/vnd.docker.container.image.v1+json\",\"size\":1342,\"digest\":\"sha256:dec1fc09e2979c1013b2ac33457f800abec524f873f8766f10403000787dea77\"},\"layers\":[{\"mediaType\":\"application/vnd.docker.image.rootfs.diff.tar\",\"size\":737792,\"digest\":\"sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877\"}]}" time="2022-07-27T17:14:21-04:00" level=debug msg="Using blob info cache at /home/user1/.local/share/containers/cache/blob-info-cache-v1.boltdb" time="2022-07-27T17:14:21-04:00" level=debug msg="IsRunningImageAllowed for image containers-storage:" time="2022-07-27T17:14:21-04:00" level=debug msg=" Using transport \"containers-storage\" policy section " time="2022-07-27T17:14:21-04:00" level=debug msg=" Requirement 0: allowed" time="2022-07-27T17:14:21-04:00" level=debug msg="Overall: allowed" time="2022-07-27T17:14:21-04:00" level=debug msg="start reading config" time="2022-07-27T17:14:21-04:00" level=debug msg="finished reading config" time="2022-07-27T17:14:21-04:00" level=debug msg="Manifest has MIME type application/vnd.oci.image.manifest.v1+json, ordered candidate list [application/vnd.oci.image.manifest.v1+json, application/vnd.docker.distribution.manifest.v2+json, application/vnd.docker.distribution.manifest.v1+prettyjws, application/vnd.docker.distribution.manifest.v1+json]" time="2022-07-27T17:14:21-04:00" level=debug 
msg="... will first try using the original manifest unmodified" time="2022-07-27T17:14:21-04:00" level=debug msg="reading layer \"sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877\"" time="2022-07-27T17:14:21-04:00" level=debug msg="No compression detected" time="2022-07-27T17:14:21-04:00" level=debug msg="Using original blob without modification" time="2022-07-27T17:14:21-04:00" level=debug msg="Applying tar in /home/user1/.local/share/containers/storage/overlay/87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877/diff" time="2022-07-27T17:14:21-04:00" level=debug msg="finished reading layer \"sha256:87abffabf5c94712c1755121a49df666a752d6e5e66b2becc47ca5cc73905877\"" time="2022-07-27T17:14:21-04:00" level=debug msg="No compression detected" time="2022-07-27T17:14:21-04:00" level=debug msg="Using original blob without modification" time="2022-07-27T17:14:21-04:00" level=debug msg="setting image creation date to 2022-07-27 21:14:21.114916664 +0000 UTC" time="2022-07-27T17:14:21-04:00" level=debug msg="created new image ID \"8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\"" time="2022-07-27T17:14:21-04:00" level=debug msg="saved image metadata \"{\\\"signatures-sizes\\\":{\\\"sha256:c184a434110486acbe2e1291207d1655c93b74d94f4f5cc32177a5942b657d2d\\\":[]}}\"" time="2022-07-27T17:14:21-04:00" level=debug msg="added name \"localhost/podman-pause:4.0.2-1648830555\" to image \"8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\"" time="2022-07-27T17:14:21-04:00" level=debug msg="Looking up image \"localhost/podman-pause:4.0.2-1648830555\" in local containers storage" time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." 
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\"" time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"localhost/podman-pause:4.0.2-1648830555\" as \"localhost/podman-pause:4.0.2-1648830555\" in local containers storage" time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"localhost/podman-pause:4.0.2-1648830555\" as \"localhost/podman-pause:4.0.2-1648830555\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889)" time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]localhost/podman-pause:4.0.2-1648830555\"" time="2022-07-27T17:14:21-04:00" level=debug msg="printing final image id \"8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\"" time="2022-07-27T17:14:21-04:00" level=debug msg="Pod will use slirp4netns" time="2022-07-27T17:14:21-04:00" level=debug msg="Got pod cgroup as /libpod_parent/6280c4d6b1405c1d399d72fe527f90ab1ca6864bb75dc62dc29582f05597aba3" time="2022-07-27T17:14:21-04:00" level=debug msg="Looking up image \"localhost/podman-pause:4.0.2-1648830555\" in local containers storage" time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"localhost/podman-pause:4.0.2-1648830555\" ..." 
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\"" time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"localhost/podman-pause:4.0.2-1648830555\" as \"localhost/podman-pause:4.0.2-1648830555\" in local containers storage" time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"localhost/podman-pause:4.0.2-1648830555\" as \"localhost/podman-pause:4.0.2-1648830555\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889)" time="2022-07-27T17:14:21-04:00" level=debug msg="Inspecting image 8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889" time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\"" time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\"" time="2022-07-27T17:14:21-04:00" level=debug msg="Inspecting image 8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889" time="2022-07-27T17:14:21-04:00" level=debug msg="Inspecting image 8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889" time="2022-07-27T17:14:21-04:00" level=debug msg="Inspecting image 8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889" time="2022-07-27T17:14:21-04:00" level=debug msg="using systemd mode: false" time="2022-07-27T17:14:21-04:00" level=debug msg="setting container name 6280c4d6b140-infra" time="2022-07-27T17:14:21-04:00" level=debug msg="Loading seccomp profile from \"/usr/share/containers/seccomp.json\"" time="2022-07-27T17:14:21-04:00" level=debug msg="Allocated lock 1 for container 
29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2"
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\""
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:8ec1b006f9bad3b61587ee42391082ad57df349c351784b1cc4f7517fe00c889\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Created container \"29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Container \"29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2\" has work directory \"/home/user1/.local/share/containers/storage/overlay-containers/29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2/userdata\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Container \"29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2\" has run directory \"/run/user/1001/containers/overlay-containers/29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2/userdata\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:14:21-04:00" level=debug msg="Pulling image quay.io/libpod/testimage:20210610 (policy: newer)"
time="2022-07-27T17:14:21-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:14:21-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/000-shortnames.conf\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/001-rhel-shortnames.conf\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/002-rhel-shortnames-overrides.conf\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/50-systemroles.conf\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:14:21-04:00" level=debug msg="Attempting to pull candidate quay.io/libpod/testimage:20210610 for quay.io/libpod/testimage:20210610"
time="2022-07-27T17:14:21-04:00" level=debug msg="Trying to access \"quay.io/libpod/testimage:20210610\""
time="2022-07-27T17:14:21-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /run/user/1001/containers/auth.json"
time="2022-07-27T17:14:21-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /home/user1/.config/containers/auth.json"
time="2022-07-27T17:14:21-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /home/user1/.docker/config.json"
time="2022-07-27T17:14:21-04:00" level=debug msg="No credentials matching quay.io/libpod/testimage found in /home/user1/.dockercfg"
time="2022-07-27T17:14:21-04:00" level=debug msg="No credentials for quay.io/libpod/testimage found"
time="2022-07-27T17:14:21-04:00" level=debug msg="Using registries.d directory /etc/containers/registries.d for sigstore configuration"
time="2022-07-27T17:14:21-04:00" level=debug msg=" Using \"default-docker\" configuration"
time="2022-07-27T17:14:21-04:00" level=debug msg=" No signature storage configuration found for quay.io/libpod/testimage:20210610, using built-in default file:///home/user1/.local/share/containers/sigstore"
time="2022-07-27T17:14:21-04:00" level=debug msg="Looking for TLS certificates and private keys in /etc/docker/certs.d/quay.io"
time="2022-07-27T17:14:21-04:00" level=debug msg="GET https://quay.io/v2/"
time="2022-07-27T17:14:21-04:00" level=debug msg="Ping https://quay.io/v2/ status 401"
time="2022-07-27T17:14:21-04:00" level=debug msg="GET https://quay.io/v2/auth?scope=repository%3Alibpod%2Ftestimage%3Apull&service=quay.io"
time="2022-07-27T17:14:21-04:00" level=debug msg="Increasing token expiration to: 60 seconds"
time="2022-07-27T17:14:21-04:00" level=debug msg="GET https://quay.io/v2/libpod/testimage/manifests/20210610"
time="2022-07-27T17:14:21-04:00" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.list.v2+json\""
time="2022-07-27T17:14:21-04:00" level=debug msg="GET https://quay.io/v2/libpod/testimage/manifests/sha256:d8dc9f2a78e190963a75852ce55b926a1cf90c7d2e6d15b30b6bc43cd73a6377"
time="2022-07-27T17:14:21-04:00" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.v2+json\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Skipping pull candidate quay.io/libpod/testimage:20210610 as the image is not newer (pull policy newer)"
time="2022-07-27T17:14:21-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:14:21-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2022-07-27T17:14:21-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T17:14:21-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T17:14:21-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T17:14:21-04:00" level=debug msg="using systemd mode: false"
time="2022-07-27T17:14:21-04:00" level=debug msg="adding container to pod httpd1"
time="2022-07-27T17:14:21-04:00" level=debug msg="setting container name httpd1-httpd1"
time="2022-07-27T17:14:21-04:00" level=debug msg="Loading default seccomp profile"
time="2022-07-27T17:14:21-04:00" level=info msg="Sysctl net.ipv4.ping_group_range=0 0 ignored in containers.conf, since Network Namespace set to host"
time="2022-07-27T17:14:21-04:00" level=debug msg="Adding mount /proc"
time="2022-07-27T17:14:21-04:00" level=debug msg="Adding mount /dev"
time="2022-07-27T17:14:21-04:00" level=debug msg="Adding mount /dev/pts"
time="2022-07-27T17:14:21-04:00" level=debug msg="Adding mount /dev/mqueue"
time="2022-07-27T17:14:21-04:00" level=debug msg="Adding mount /sys"
time="2022-07-27T17:14:21-04:00" level=debug msg="Adding mount /sys/fs/cgroup"
time="2022-07-27T17:14:21-04:00" level=debug msg="Allocated lock 2 for container dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1"
time="2022-07-27T17:14:21-04:00" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Created container \"dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Container \"dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1\" has work directory \"/home/user1/.local/share/containers/storage/overlay-containers/dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1/userdata\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Container \"dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1\" has run directory \"/run/user/1001/containers/overlay-containers/dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1/userdata\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Strongconnecting node 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2"
time="2022-07-27T17:14:21-04:00" level=debug msg="Pushed 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 onto stack"
time="2022-07-27T17:14:21-04:00" level=debug msg="Finishing node 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2. Popped 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 off stack"
time="2022-07-27T17:14:21-04:00" level=debug msg="Strongconnecting node dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1"
time="2022-07-27T17:14:21-04:00" level=debug msg="Pushed dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 onto stack"
time="2022-07-27T17:14:21-04:00" level=debug msg="Finishing node dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1. Popped dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 off stack"
time="2022-07-27T17:14:21-04:00" level=debug msg="overlay: mount_data=,lowerdir=/home/user1/.local/share/containers/storage/overlay/l/R7TWEURKUBCRVQQKVHCSMZG7BR,upperdir=/home/user1/.local/share/containers/storage/overlay/a61f20c97dcac921d79019a8ad8833f84649b2bc0c2a8f12ef102248099efe56/diff,workdir=/home/user1/.local/share/containers/storage/overlay/a61f20c97dcac921d79019a8ad8833f84649b2bc0c2a8f12ef102248099efe56/work,userxattr,context=\"system_u:object_r:container_file_t:s0:c48,c504\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Mounted container \"29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2\" at \"/home/user1/.local/share/containers/storage/overlay/a61f20c97dcac921d79019a8ad8833f84649b2bc0c2a8f12ef102248099efe56/merged\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Created root filesystem for container 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 at /home/user1/.local/share/containers/storage/overlay/a61f20c97dcac921d79019a8ad8833f84649b2bc0c2a8f12ef102248099efe56/merged"
time="2022-07-27T17:14:21-04:00" level=debug msg="Made network namespace at /run/user/1001/netns/netns-761618e5-1eea-f5e2-a318-087218d9916f for container 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2"
time="2022-07-27T17:14:21-04:00" level=debug msg="slirp4netns command: /bin/slirp4netns --disable-host-loopback --mtu=65520 --enable-sandbox --enable-seccomp --enable-ipv6 -c -e 3 -r 4 --netns-type=path /run/user/1001/netns/netns-761618e5-1eea-f5e2-a318-087218d9916f tap0"
time="2022-07-27T17:14:21-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:14:21-04:00\" level=info msg=\"Starting parent driver\"\n"
time="2022-07-27T17:14:21-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:14:21-04:00\" level=info msg=\"opaque=map[builtin.readypipepath:/run/user/1001/libpod/tmp/rootlessport2316983053/.bp-ready.pipe builtin.socketpath:/run/user/1001/libpod/tmp/rootlessport2316983053/.bp.sock]\"\n"
time="2022-07-27T17:14:21-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:14:21-04:00\" level=info msg=\"Starting child driver in child netns (\\\"/proc/self/exe\\\" [rootlessport-child])\"\n"
time="2022-07-27T17:14:21-04:00" level=debug msg="rootlessport: time=\"2022-07-27T17:14:21-04:00\" level=info msg=\"Waiting for initComplete\"\n"
time="2022-07-27T17:14:21-04:00" level=debug msg="rootlessport is ready"
time="2022-07-27T17:14:21-04:00" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription"
time="2022-07-27T17:14:21-04:00" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d"
time="2022-07-27T17:14:21-04:00" level=debug msg="Workdir \"/\" resolved to host path \"/home/user1/.local/share/containers/storage/overlay/a61f20c97dcac921d79019a8ad8833f84649b2bc0c2a8f12ef102248099efe56/merged\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Created OCI spec for container 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 at /home/user1/.local/share/containers/storage/overlay-containers/29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2/userdata/config.json"
time="2022-07-27T17:14:21-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog"
time="2022-07-27T17:14:21-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 -u 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 -r /usr/bin/runc -b /home/user1/.local/share/containers/storage/overlay-containers/29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2/userdata -p /run/user/1001/containers/overlay-containers/29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2/userdata/pidfile -n 6280c4d6b140-infra --exit-dir /run/user/1001/libpod/tmp/exits --full-attach -l k8s-file:/home/user1/.local/share/containers/storage/overlay-containers/29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2/userdata/ctr.log --log-level debug --syslog --conmon-pidfile /run/user/1001/containers/overlay-containers/29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /home/user1/.local/share/containers/storage --exit-command-arg --runroot --exit-command-arg /run/user/1001/containers --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg cgroupfs --exit-command-arg --tmpdir --exit-command-arg /run/user/1001/libpod/tmp --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg cni --exit-command-arg --runtime --exit-command-arg runc --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --events-backend --exit-command-arg file --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2]"
time="2022-07-27T17:14:21-04:00" level=info msg="Failed to add conmon to cgroupfs sandbox cgroup: error creating cgroup for cpu: mkdir /sys/fs/cgroup/cpu/libpod_parent: permission denied"
[conmon:d]: failed to write to /proc/self/oom_score_adj: Permission denied
time="2022-07-27T17:14:21-04:00" level=debug msg="Received: 20740"
time="2022-07-27T17:14:21-04:00" level=info msg="Got Conmon PID as 20728"
time="2022-07-27T17:14:21-04:00" level=debug msg="Created container 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 in OCI runtime"
time="2022-07-27T17:14:21-04:00" level=debug msg="Starting container 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2 with command [/catatonit -P]"
time="2022-07-27T17:14:21-04:00" level=debug msg="Started container 29405e37a87d7bcf7f06907173bb5d02788a5f1b4d5d6c81e65455bc138305b2"
time="2022-07-27T17:14:21-04:00" level=debug msg="overlay: mount_data=,lowerdir=/home/user1/.local/share/containers/storage/overlay/l/S6HFSEV4HHN2VDT5WTDTIHFOLI,upperdir=/home/user1/.local/share/containers/storage/overlay/61672f5c77b7f5ccffe3c40e83566f81ee4db66a20f8dd3efac1c3828b02b021/diff,workdir=/home/user1/.local/share/containers/storage/overlay/61672f5c77b7f5ccffe3c40e83566f81ee4db66a20f8dd3efac1c3828b02b021/work,userxattr,context=\"system_u:object_r:container_file_t:s0:c48,c504\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Mounted container \"dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1\" at \"/home/user1/.local/share/containers/storage/overlay/61672f5c77b7f5ccffe3c40e83566f81ee4db66a20f8dd3efac1c3828b02b021/merged\""
time="2022-07-27T17:14:21-04:00" level=debug msg="Created root filesystem for container dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 at /home/user1/.local/share/containers/storage/overlay/61672f5c77b7f5ccffe3c40e83566f81ee4db66a20f8dd3efac1c3828b02b021/merged"
time="2022-07-27T17:14:21-04:00" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription"
time="2022-07-27T17:14:21-04:00" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d"
time="2022-07-27T17:14:21-04:00" level=debug msg="Workdir \"/var/www\" resolved to a volume or mount"
time="2022-07-27T17:14:21-04:00" level=debug msg="Created OCI spec for container dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 at /home/user1/.local/share/containers/storage/overlay-containers/dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1/userdata/config.json"
time="2022-07-27T17:14:21-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog"
time="2022-07-27T17:14:21-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 -u dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 -r /usr/bin/runc -b /home/user1/.local/share/containers/storage/overlay-containers/dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1/userdata -p /run/user/1001/containers/overlay-containers/dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1/userdata/pidfile -n httpd1-httpd1 --exit-dir /run/user/1001/libpod/tmp/exits --full-attach -l k8s-file:/home/user1/.local/share/containers/storage/overlay-containers/dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1/userdata/ctr.log --log-level debug --syslog --conmon-pidfile /run/user/1001/containers/overlay-containers/dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /home/user1/.local/share/containers/storage --exit-command-arg --runroot --exit-command-arg /run/user/1001/containers --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg cgroupfs --exit-command-arg --tmpdir --exit-command-arg /run/user/1001/libpod/tmp --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg cni --exit-command-arg --runtime --exit-command-arg runc --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --events-backend --exit-command-arg file --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1]"
time="2022-07-27T17:14:21-04:00" level=info msg="Failed to add conmon to cgroupfs sandbox cgroup: error creating cgroup for cpuset: mkdir /sys/fs/cgroup/cpuset/conmon: permission denied"
[conmon:d]: failed to write to /proc/self/oom_score_adj: Permission denied
time="2022-07-27T17:14:21-04:00" level=debug msg="Received: 20765"
time="2022-07-27T17:14:21-04:00" level=info msg="Got Conmon PID as 20753"
time="2022-07-27T17:14:21-04:00" level=debug msg="Created container dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 in OCI runtime"
time="2022-07-27T17:14:21-04:00" level=debug msg="Starting container dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1 with command [/bin/busybox-extras httpd -f -p 80]"
time="2022-07-27T17:14:21-04:00" level=debug msg="Started container dcf2e93d798ca1b1515142288e5d2259524539646c7438651efb5e707710e0a1"
time="2022-07-27T17:14:21-04:00" level=debug msg="Called kube.PersistentPostRunE(/bin/podman play kube --start=true --log-level=debug /home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml)"

TASK [fedora.linux_system_roles.podman : Reload systemctl] *********************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:86
Wednesday 27 July 2022 21:14:23 +0000 (0:00:01.753) 0:01:17.176 ********
ok: [/cache/rhel-8.qcow2] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.podman : Enable service] ***********************
task path: /tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:98
Wednesday 27 July 2022 21:14:23 +0000 (0:00:00.567) 0:01:17.744 ********
fatal: [/cache/rhel-8.qcow2]: FAILED! => { "changed": false }

MSG:

Could not find the requested service podman-kube@-home-user1-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service: host

TASK [Clean up storage.conf] ***************************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:346
Wednesday 27 July 2022 21:14:24 +0000 (0:00:00.534) 0:01:18.278 ********
changed: [/cache/rhel-8.qcow2] => { "changed": true, "path": "/etc/containers/storage.conf", "state": "absent" }

TASK [Clean up host directories] ***********************************************
task path: /tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:353
Wednesday 27 July 2022 21:14:24 +0000 (0:00:00.374) 0:01:18.653 ********
changed: [/cache/rhel-8.qcow2] => (item=httpd1) => { "ansible_loop_var": "item", "changed": true, "item": "httpd1", "path": "/tmp/httpd1", "state": "absent" }
changed: [/cache/rhel-8.qcow2] => (item=httpd2) => { "ansible_loop_var": "item", "changed": true, "item": "httpd2", "path": "/tmp/httpd2", "state": "absent" }
changed: [/cache/rhel-8.qcow2] => (item=httpd3) => { "ansible_loop_var": "item", "changed": true, "item": "httpd3", "path": "/tmp/httpd3", "state": "absent" }

PLAY RECAP *********************************************************************
/cache/rhel-8.qcow2 : ok=57 changed=25 unreachable=0 failed=1 skipped=30 rescued=0 ignored=0

Wednesday 27 July 2022 21:14:25 +0000 (0:00:01.106) 0:01:19.760 ********
===============================================================================
Install podman from updates-testing ------------------------------------ 39.99s
/tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:139 -----------------------------
Enable podman copr ------------------------------------------------------ 2.51s
/tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:136 -----------------------------
fedora.linux_system_roles.firewall : Install firewalld ------------------ 2.45s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:7
fedora.linux_system_roles.selinux : Set an SELinux label on a port ------ 2.31s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:86
fedora.linux_system_roles.podman : Ensure container images are present --- 2.06s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:35
Create data files ------------------------------------------------------- 2.06s
/tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:171 -----------------------------
fedora.linux_system_roles.podman : Update containers/pods --------------- 1.75s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:78
fedora.linux_system_roles.podman : Ensure required packages are installed --- 1.54s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
set up internal repositories -------------------------------------------- 1.34s
/cache/rhel-8_setup.yml:5 -----------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.30s
/tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:3 -------------------------------
fedora.linux_system_roles.selinux : Install SELinux tool semanage ------- 1.27s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:28
fedora.linux_system_roles.firewall : Install python3-firewall ----------- 1.26s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:18
fedora.linux_system_roles.selinux : Install SELinux python3 tools ------- 1.24s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:15
Create host directories for data ---------------------------------------- 1.24s
/tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:163 -----------------------------
Clean up host directories ----------------------------------------------- 1.11s
/tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:353 -----------------------------
fedora.linux_system_roles.firewall : Enable and start firewalld service --- 1.03s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:3
Create user ------------------------------------------------------------- 0.91s
/tmp/tmpmow9d4_c/tests/podman/tests_basic.yml:145 -----------------------------
fedora.linux_system_roles.firewall : Configure firewall ----------------- 0.82s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:36
fedora.linux_system_roles.selinux : refresh facts ----------------------- 0.80s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:23
fedora.linux_system_roles.podman : Create host directories -------------- 0.77s
/tmp/tmprng1zc_n/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:24