Formatting '/cache/fedora-35.qcow2.snap', fmt=qcow2 cluster_size=65536 extended_l2=off compression_type=zlib size=4294967296 backing_file=/cache/fedora-35.qcow2 backing_fmt=qcow2 lazy_refcounts=off refcount_bits=16
ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpkx6j_hnm
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: fedora-35_setup.yml **************************************************
1 plays in /cache/fedora-35_setup.yml

PLAY [Set up host for test playbooks] ******************************************

TASK [Gathering Facts] *********************************************************
task path: /cache/fedora-35_setup.yml:1
Wednesday 27 July 2022 21:15:32 +0000 (0:00:00.012) 0:00:00.012 ********
ok: [/cache/fedora-35.qcow2.snap]
META: ran handlers

TASK [Create EPEL 35 repo] *****************************************************
task path: /cache/fedora-35_setup.yml:5
Wednesday 27 July 2022 21:15:33 +0000 (0:00:01.179) 0:00:01.192 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [Create yum cache] ********************************************************
task path: /cache/fedora-35_setup.yml:15
Wednesday 27 July 2022 21:15:33 +0000 (0:00:00.019) 0:00:01.212 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [Create dnf cache] ********************************************************
task path: /cache/fedora-35_setup.yml:21
Wednesday 27 July 2022 21:15:33 +0000 (0:00:00.019) 0:00:01.232 ********
changed: [/cache/fedora-35.qcow2.snap] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true}

TASK [Disable EPEL 7] **********************************************************
task path: /cache/fedora-35_setup.yml:27
Wednesday 27 July 2022 21:16:17 +0000 (0:00:44.389) 0:00:45.621 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [Disable EPEL 8] **********************************************************
task path: /cache/fedora-35_setup.yml:35
Wednesday 27 July 2022 21:16:17 +0000 (0:00:00.023) 0:00:45.645 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/fedora-35.qcow2.snap : ok=2 changed=1 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0

Wednesday 27 July 2022 21:16:17 +0000 (0:00:00.028) 0:00:45.674 ********
===============================================================================
Create dnf cache ------------------------------------------------------- 44.39s
/cache/fedora-35_setup.yml:21 -------------------------------------------------
Gathering Facts --------------------------------------------------------- 1.18s
/cache/fedora-35_setup.yml:1 --------------------------------------------------
Disable EPEL 8 ---------------------------------------------------------- 0.03s
/cache/fedora-35_setup.yml:35 -------------------------------------------------
Disable EPEL 7 ---------------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:27 -------------------------------------------------
Create yum cache -------------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:15 -------------------------------------------------
Create EPEL 35 repo ----------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:5 --------------------------------------------------

PLAYBOOK: setup-snapshot.yml ***************************************************
1 plays in /tmp/tmp8qw1qz5f/tests/setup-snapshot.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:2
Wednesday 27 July 2022 21:16:17 +0000 (0:00:00.011) 0:00:45.686 ********
ok: [/cache/fedora-35.qcow2.snap]
META: ran handlers

TASK [Set platform/version specific variables] *********************************
task path: /tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:4
Wednesday 27 July 2022 21:16:18 +0000 (0:00:00.792) 0:00:46.478 ********

TASK [linux-system-roles.podman : Ensure ansible_facts used by role] ***********
task path: /tmp/tmp8qw1qz5f/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:3
Wednesday 27 July 2022 21:16:18 +0000 (0:00:00.034) 0:00:46.513 ********
ok: [/cache/fedora-35.qcow2.snap]

TASK [linux-system-roles.podman : Set platform/version specific variables] *****
task path: /tmp/tmp8qw1qz5f/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:9
Wednesday 27 July 2022 21:16:19 +0000 (0:00:00.536) 0:00:47.049 ********
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {"ansible_loop_var": "item", "changed": false, "item": "Fedora.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml) => {"ansible_loop_var": "item", "changed": false, "item": "Fedora_35.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml) => {"ansible_loop_var": "item", "changed": false, "item": "Fedora_35.yml", "skip_reason": "Conditional result was False"}
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [Install test packages] ***************************************************
task path: /tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:10
Wednesday 27 July 2022 21:16:19 +0000 (0:00:00.048) 0:00:47.097 ********
changed: [/cache/fedora-35.qcow2.snap] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: netavark-1.0.3-1.fc35.x86_64",
        "Installed: fuse3-3.10.5-1.fc35.x86_64",
        "Installed: fuse3-libs-3.10.5-1.fc35.x86_64",
        "Installed: libslirp-4.6.1-2.fc35.x86_64",
        "Installed: crun-1.4.5-1.fc35.x86_64",
        "Installed: libnftnl-1.2.0-2.fc35.x86_64",
        "Installed: nftables-1:1.0.0-1.fc35.x86_64",
        "Installed: conmon-2:2.1.0-2.fc35.x86_64",
        "Installed: container-selinux-2:2.189.0-1.fc35.noarch",
        "Installed: libbsd-0.10.0-8.fc35.x86_64",
        "Installed: containernetworking-plugins-1.1.0-1.fc35.x86_64",
        "Installed: containers-common-4:1-45.fc35.noarch",
        "Installed: jansson-2.13.1-3.fc35.x86_64",
        "Installed: dnsmasq-2.86-6.fc35.x86_64",
        "Installed: aardvark-dns-1.0.3-1.fc35.x86_64",
        "Installed: fuse-overlayfs-1.9-1.fc35.x86_64",
        "Installed: shadow-utils-subid-2:4.9-9.fc35.x86_64",
        "Installed: slirp4netns-1.1.12-2.fc35.x86_64",
        "Installed: libnet-1.2-4.fc35.x86_64",
        "Installed: podman-3:3.4.7-2.fc35.x86_64",
        "Installed: iptables-legacy-1.8.7-13.fc35.x86_64",
        "Installed: podman-gvproxy-3:3.4.7-2.fc35.x86_64",
        "Installed: podman-plugins-3:3.4.7-2.fc35.x86_64",
        "Installed: fuse-common-3.10.5-1.fc35.x86_64",
        "Installed: catatonit-0.1.7-1.fc35.x86_64",
        "Installed: yajl-2.1.0-17.fc35.x86_64",
        "Installed: criu-3.16-2.fc35.x86_64",
        "Installed: criu-libs-3.16-2.fc35.x86_64"
    ]
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/fedora-35.qcow2.snap : ok=5 changed=2 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0

Wednesday 27 July 2022 21:17:00 +0000 (0:00:41.289) 0:01:28.386 ********
===============================================================================
Create dnf cache ------------------------------------------------------- 44.39s
/cache/fedora-35_setup.yml:21 -------------------------------------------------
Install test packages -------------------------------------------------- 41.29s
/tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:10 ----------------------------------
Gathering Facts --------------------------------------------------------- 1.18s
/cache/fedora-35_setup.yml:1 --------------------------------------------------
Gathering Facts --------------------------------------------------------- 0.79s
/tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:2 -----------------------------------
linux-system-roles.podman : Ensure ansible_facts used by role ----------- 0.54s
/tmp/tmp8qw1qz5f/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:3 ---
linux-system-roles.podman : Set platform/version specific variables ----- 0.05s
/tmp/tmp8qw1qz5f/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:9 ---
Set platform/version specific variables --------------------------------- 0.03s
/tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:4 -----------------------------------
Disable EPEL 8 ---------------------------------------------------------- 0.03s
/cache/fedora-35_setup.yml:35 -------------------------------------------------
Disable EPEL 7 ---------------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:27 -------------------------------------------------
Create yum cache -------------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:15 -------------------------------------------------
Create EPEL 35 repo ----------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:5 --------------------------------------------------

PLAYBOOK: fedora-35_post_setup.yml *********************************************
1 plays in /cache/fedora-35_post_setup.yml

PLAY [Post setup - these happen last] ******************************************
META: ran handlers

TASK [force sync of filesystems - ensure setup changes are made to snapshot] ***
task path: /cache/fedora-35_post_setup.yml:5
Wednesday 27 July 2022 21:17:00 +0000 (0:00:00.018) 0:01:28.405 ********
changed: [/cache/fedora-35.qcow2.snap] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true}

TASK [shutdown guest] **********************************************************
task path: /cache/fedora-35_post_setup.yml:8
Wednesday 27 July 2022 21:17:00 +0000 (0:00:00.431) 0:01:28.837 ********
changed: [/cache/fedora-35.qcow2.snap] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": true}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/fedora-35.qcow2.snap : ok=7 changed=4 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0

Wednesday 27 July 2022 21:17:01 +0000 (0:00:00.550) 0:01:29.387 ********
===============================================================================
Create dnf cache ------------------------------------------------------- 44.39s
/cache/fedora-35_setup.yml:21 -------------------------------------------------
Install test packages -------------------------------------------------- 41.29s
/tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:10 ----------------------------------
Gathering Facts --------------------------------------------------------- 1.18s
/cache/fedora-35_setup.yml:1 --------------------------------------------------
Gathering Facts --------------------------------------------------------- 0.79s
/tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:2 -----------------------------------
shutdown guest ---------------------------------------------------------- 0.55s
/cache/fedora-35_post_setup.yml:8 ---------------------------------------------
linux-system-roles.podman : Ensure ansible_facts used by role ----------- 0.54s
/tmp/tmp8qw1qz5f/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:3 ---
force sync of filesystems - ensure setup changes are made to snapshot --- 0.43s
/cache/fedora-35_post_setup.yml:5 ---------------------------------------------
linux-system-roles.podman : Set platform/version specific variables ----- 0.05s
/tmp/tmp8qw1qz5f/tests/roles/linux-system-roles.podman/tasks/set_vars.yml:9 ---
Set platform/version specific variables --------------------------------- 0.03s
/tmp/tmp8qw1qz5f/tests/setup-snapshot.yml:4 -----------------------------------
Disable EPEL 8 ---------------------------------------------------------- 0.03s
/cache/fedora-35_setup.yml:35 -------------------------------------------------
Disable EPEL 7 ---------------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:27 -------------------------------------------------
Create yum cache -------------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:15 -------------------------------------------------
Create EPEL 35 repo ----------------------------------------------------- 0.02s
/cache/fedora-35_setup.yml:5 --------------------------------------------------

ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpkx6j_hnm
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_basic.yml ******************************************************
1 plays in /tmp/tmp1te44sz5/tests/podman/tests_basic.yml

PLAY [Ensure that the role runs with default parameters] ***********************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:3
Wednesday 27 July 2022 21:17:42 +0000 (0:00:00.013) 0:00:00.013 ********
ok: [/cache/fedora-35.qcow2.snap]
META: ran handlers

TASK [Enable podman copr] ******************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:136
Wednesday 27 July 2022 21:17:44 +0000 (0:00:01.182) 0:00:01.196 ********
changed: [/cache/fedora-35.qcow2.snap] => {"changed": true, "cmd": ["dnf", "copr", "enable", "rhcontainerbot/podman-next", "-y"], "delta": "0:00:00.456057", "end": "2022-07-27 21:17:44.588478", "rc": 0, "start": "2022-07-27 21:17:44.132421"}

STDOUT:

Repository successfully enabled.

STDERR:

Enabling a Copr repository. Please note that this repository is not part of the main distribution, and quality may vary. The Fedora Project does not exercise any power over the contents of this repository beyond the rules outlined in the Copr FAQ at , and packages are not held to any quality or security level. Please do not file bug reports about these packages in Fedora Bugzilla. In case of problems, contact the owner of this repository.
TASK [Install podman from updates-testing] *************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:139
Wednesday 27 July 2022 21:17:45 +0000 (0:00:00.931) 0:00:02.127 ********
changed: [/cache/fedora-35.qcow2.snap] => {"changed": true, "cmd": ["dnf", "-y", "install", "podman"], "delta": "0:00:03.265504", "end": "2022-07-27 21:17:48.221472", "rc": 0, "start": "2022-07-27 21:17:44.955968"}

STDOUT:

Copr repo for podman-next owned by rhcontainerb 14 MB/s | 2.9 MB 00:00
Last metadata expiration check: 0:00:01 ago on Wed 27 Jul 2022 09:17:45 PM UTC.
Package podman-3:3.4.7-2.fc35.x86_64 is already installed.
Dependencies resolved.
Nothing to do.
Complete!

TASK [Podman version] **********************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:142
Wednesday 27 July 2022 21:17:48 +0000 (0:00:03.642) 0:00:05.770 ********
changed: [/cache/fedora-35.qcow2.snap] => {"changed": true, "cmd": ["podman", "--version"], "delta": "0:00:00.092099", "end": "2022-07-27 21:17:48.703504", "rc": 0, "start": "2022-07-27 21:17:48.611405"}

STDOUT:

podman version 3.4.7

TASK [Create user] *************************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:145
Wednesday 27 July 2022 21:17:49 +0000 (0:00:00.473) 0:00:06.243 ********
changed: [/cache/fedora-35.qcow2.snap] => {"changed": true, "comment": "", "create_home": true, "group": 1001, "home": "/home/user1", "name": "user1", "shell": "/bin/bash", "state": "present", "system": false, "uid": 1001}

TASK [Create tempfile for kube_src] ********************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:149
Wednesday 27 July 2022 21:17:49 +0000 (0:00:00.664) 0:00:06.907 ********
changed: [/cache/fedora-35.qcow2.snap -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/lsr_podman_v5rizl4i.yml", "size": 0, "state": "file", "uid": 0}

TASK [Write kube_file_src] *****************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:157
Wednesday 27 July 2022 21:17:50 +0000 (0:00:00.326) 0:00:07.233 ********
changed: [/cache/fedora-35.qcow2.snap -> localhost] => {"changed": true, "checksum": "7c999d33fe2b60b3c65ec0a85b8924cc4e970d83", "dest": "/tmp/lsr_podman_v5rizl4i.yml", "gid": 0, "group": "root", "md5sum": "4807f59df21780236988d1eb89142aa2", "mode": "0600", "owner": "root", "size": 665, "src": "/root/.ansible/tmp/ansible-tmp-1658956670.138877-90146-60248820358235/source", "state": "file", "uid": 0}

TASK [Create host directories for data] ****************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:163
Wednesday 27 July 2022 21:17:50 +0000 (0:00:00.628) 0:00:07.862 ********
changed: [/cache/fedora-35.qcow2.snap] => (item=['httpd1', 'user1', 1001]) => {"ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": ["httpd1", "user1", 1001], "mode": "0755", "owner": "user1", "path": "/tmp/httpd1", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 40, "state": "directory", "uid": 1001}
changed: [/cache/fedora-35.qcow2.snap] => (item=['httpd2', 'root', 0]) => {"ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": ["httpd2", "root", 0], "mode": "0755", "owner": "root", "path": "/tmp/httpd2", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 40, "state": "directory", "uid": 0}
changed: [/cache/fedora-35.qcow2.snap] => (item=['httpd3', 'root', 0]) => {"ansible_loop_var": "item", "changed": true, "gid": 0, "group": "root", "item": ["httpd3", "root", 0], "mode": "0755", "owner": "root", "path": "/tmp/httpd3", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 40, "state": "directory", "uid": 0}

TASK [Create data files] *******************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:171
Wednesday 27 July 2022 21:17:51 +0000 (0:00:01.254) 0:00:09.116 ********
changed: [/cache/fedora-35.qcow2.snap] => (item=['httpd1', 'user1', 1001]) => {"ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd1/index.txt", "gid": 0, "group": "root", "item": ["httpd1", "user1", 1001], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "user1", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956672.0586882-90211-6915400208898/source", "state": "file", "uid": 1001}
changed: [/cache/fedora-35.qcow2.snap] => (item=['httpd2', 'root', 0]) => {"ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd2/index.txt", "gid": 0, "group": "root", "item": ["httpd2", "root", 0], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956672.722295-90211-111760806564456/source", "state": "file", "uid": 0}
changed: [/cache/fedora-35.qcow2.snap] => (item=['httpd3', 'root', 0]) => {"ansible_loop_var": "item", "changed": true, "checksum": "40bd001563085fc35165329ea1ff5c5ecbdbbeef", "dest": "/tmp/httpd3/index.txt", "gid": 0, "group": "root", "item": ["httpd3", "root", 0], "md5sum": "202cb962ac59075b964b07152d234b70", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 3, "src": "/root/.ansible/tmp/ansible-tmp-1658956673.3965738-90211-206217566847166/source", "state": "file", "uid": 0}

TASK [Run role] ****************************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:179
Wednesday 27 July 2022 21:17:54 +0000 (0:00:02.034) 0:00:11.150 ********

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Wednesday 27 July 2022 21:17:54 +0000 (0:00:00.044) 0:00:11.194 ********
included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Wednesday 27 July 2022 21:17:54 +0000 (0:00:00.030) 0:00:11.225 ********
ok: [/cache/fedora-35.qcow2.snap]

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:8
Wednesday 27 July 2022 21:17:54 +0000 (0:00:00.546) 0:00:11.771 ********
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml) => {"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {"ansible_loop_var": "item", "changed": false, "item": "Fedora.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml) => {"ansible_loop_var": "item", "changed": false, "item": "Fedora_35.yml", "skip_reason": "Conditional result was False"}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml) => {"ansible_loop_var": "item", "changed": false, "item": "Fedora_35.yml", "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Wednesday 27 July 2022 21:17:54 +0000 (0:00:00.041) 0:00:11.813 ********
ok: [/cache/fedora-35.qcow2.snap] => {"changed": false, "rc": 0, "results": []}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:11
Wednesday 27 July 2022 21:17:56 +0000 (0:00:02.279) 0:00:14.092 ********
included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists - system] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:3
Wednesday 27 July 2022 21:17:57 +0000 (0:00:00.034) 0:00:14.127 ********
changed: [/cache/fedora-35.qcow2.snap] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/containers.conf.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 0, "state": "directory", "uid": 0}

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists - user] ****
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:12
Wednesday 27 July 2022 21:17:57 +0000 (0:00:00.443) 0:00:14.570 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.podman : Update system container config file] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:21
Wednesday 27 July 2022 21:17:57 +0000 (0:00:00.022) 0:00:14.592 ********
changed: [/cache/fedora-35.qcow2.snap] => {"changed": true, "checksum": "c56b033cc4634627f8794d6ccf07a051e0820c07", "dest": "/etc/containers/containers.conf.d/50-systemroles.conf", "gid": 0, "group": "root", "md5sum": "f25228df7b38eaff9b4f63b1b39baa1c", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 94, "src": "/root/.ansible/tmp/ansible-tmp-1658956677.5334766-90347-191018229851165/source", "state": "file", "uid": 0}

TASK [fedora.linux_system_roles.podman : Update non-root user container config file] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:32
Wednesday 27 July 2022 21:17:58 +0000 (0:00:00.749) 0:00:15.342 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Wednesday 27 July 2022 21:17:58 +0000 (0:00:00.022) 0:00:15.365 ********
included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists - system] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:3
Wednesday 27 July 2022 21:17:58 +0000 (0:00:00.034) 0:00:15.399 ********
ok: [/cache/fedora-35.qcow2.snap] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/registries.conf.d", "secontext": "system_u:object_r:etc_t:s0", "size": 38, "state": "directory", "uid": 0}

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists - user] ****
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:12
Wednesday 27 July 2022 21:17:58 +0000 (0:00:00.426) 0:00:15.826 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.podman : Update system registries config file] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:21
Wednesday 27 July 2022 21:17:58 +0000 (0:00:00.021) 0:00:15.847 ********
changed: [/cache/fedora-35.qcow2.snap] => {"changed": true, "checksum": "15062ec12da5e642a2b0fb64c5e03d43b80d9cf0", "dest": "/etc/containers/registries.conf.d/50-systemroles.conf", "gid": 0, "group": "root", "md5sum": "88be21c8634b01869b9f694831b84c1d", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 22, "src": "/root/.ansible/tmp/ansible-tmp-1658956678.7851908-90397-186431747225126/source", "state": "file", "uid": 0}

TASK [fedora.linux_system_roles.podman : Update non-root user registries config file] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:32
Wednesday 27 July 2022 21:17:59 +0000 (0:00:00.710) 0:00:16.558 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:17
Wednesday 27 July 2022 21:17:59 +0000 (0:00:00.022) 0:00:16.581 ********

TASK [fedora.linux_system_roles.firewall : include_tasks] **********************
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:1
Wednesday 27 July 2022 21:17:59 +0000 (0:00:00.063) 0:00:16.644 ********
included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.firewall : Ensure ansible_facts used by role] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:2
Wednesday 27 July 2022 21:17:59 +0000 (0:00:00.031) 0:00:16.676 ********
ok: [/cache/fedora-35.qcow2.snap]

TASK [fedora.linux_system_roles.firewall : Install firewalld] ******************
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:7
Wednesday 27 July 2022 21:18:00 +0000 (0:00:00.537) 0:00:17.213 ********
changed: [/cache/fedora-35.qcow2.snap] => {
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: ipset-7.15-1.fc35.x86_64",
        "Installed: iptables-nft-1.8.7-13.fc35.x86_64",
        "Installed: ipset-libs-7.15-1.fc35.x86_64",
        "Installed: firewalld-1.0.5-2.fc35.noarch",
        "Installed: python3-gobject-base-3.42.0-1.fc35.x86_64",
        "Installed: firewalld-filesystem-1.0.5-2.fc35.noarch",
        "Installed: python3-nftables-1:1.0.0-1.fc35.x86_64",
        "Installed: libcap-ng-python3-0.8.2-8.fc35.x86_64",
        "Installed: gobject-introspection-1.70.0-1.fc35.x86_64",
        "Installed: python3-firewall-1.0.5-2.fc35.noarch"
    ]
}

TASK [fedora.linux_system_roles.firewall : Install python-firewall] ************
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:12
Wednesday 27 July 2022 21:18:04 +0000 (0:00:04.130) 0:00:21.344 ********
skipping: [/cache/fedora-35.qcow2.snap] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.firewall : Install python3-firewall] ***********
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:18
Wednesday 27 July 2022 21:18:04 +0000 (0:00:00.027) 0:00:21.372 ********
ok: [/cache/fedora-35.qcow2.snap] => {"changed": false, "rc": 0, "results": []}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.firewall : Enable and start firewalld service] ***
task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:3
Wednesday 27 July 2022 21:18:06 +0000 (0:00:01.844) 0:00:23.217 ********
changed: [/cache/fedora-35.qcow2.snap] => { "changed": true, "enabled": true, "name": "firewalld", "state": 
"started", "status": { "ActiveEnterTimestamp": "n/a", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestamp": "n/a", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "sysinit.target basic.target dbus.socket system.slice dbus-broker.service polkit.service", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestamp": "n/a", "AssertTimestampMonotonic": "0", "Before": "shutdown.target network-pre.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "org.fedoraproject.FirewallD1", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestamp": "n/a", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "ipset.service nftables.service shutdown.target ebtables.service ip6tables.service iptables.service", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", 
"DefaultMemoryMin": "0", "Delegate": "no", "Description": "firewalld - dynamic firewall daemon", "DevicePolicy": "auto", "Documentation": "\"man:firewalld(1)\"", "DynamicUser": "no", "EnvironmentFiles": "/etc/sysconfig/firewalld (ignore_errors=yes)", "ExecMainCode": "0", "ExecMainExitTimestamp": "n/a", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestamp": "n/a", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/bin/kill ; argv[]=/bin/kill -HUP $MAINPID ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/firewalld ; argv[]=/usr/sbin/firewalld --nofork --nopid $FIREWALLD_ARGS ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/firewalld.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "firewalld.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "n/a", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "n/a", 
"InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "7730", "LimitNPROCSoft": "7730", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "7730", "LimitSIGPENDINGSoft": "7730", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "firewalld.service dbus-org.fedoraproject.FirewallD1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": 
"no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.socket system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "null", "StandardInput": "null", "StandardOutput": "null", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "n/a", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "2319", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", 
"Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestamp": "n/a", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.firewall : Check if previous replaced is defined] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:9 Wednesday 27 July 2022 21:18:07 +0000 (0:00:00.970) 0:00:24.187 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__firewall_previous_replaced": false, "__firewall_python_cmd": "/usr/bin/python3" }, "changed": false } TASK [fedora.linux_system_roles.firewall : Get config files, checksums before and remove] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:14 Wednesday 27 July 2022 21:18:07 +0000 (0:00:00.048) 0:00:24.235 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Configure firewall] ***************** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:36 Wednesday 27 July 2022 21:18:07 +0000 (0:00:00.032) 0:00:24.268 ******** changed: [/cache/fedora-35.qcow2.snap] => (item={'port': '8080-8082/tcp', 'state': 'enabled'}) => { "__firewall_changed": true, "ansible_loop_var": "item", "changed": true, "item": { "port": "8080-8082/tcp", "state": "enabled" } } TASK [fedora.linux_system_roles.firewall : Get config files, checksums after] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:66 Wednesday 27 July 2022 21:18:07 +0000 (0:00:00.701) 0:00:24.969 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.firewall : Calculate what has changed] ********* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:74 Wednesday 27 July 2022 21:18:07 +0000 (0:00:00.034) 0:00:25.003 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.firewall : Show diffs] ************************* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:80 Wednesday 27 July 2022 21:18:07 +0000 (0:00:00.033) 0:00:25.037 ******** skipping: [/cache/fedora-35.qcow2.snap] => {} META: role_complete for /cache/fedora-35.qcow2.snap TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:24 Wednesday 27 July 2022 21:18:07 +0000 (0:00:00.041) 0:00:25.078 ******** redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.selinux to ansible.posix.selinux redirecting (type: modules) ansible.builtin.seboolean to ansible.posix.seboolean redirecting (type: modules) ansible.builtin.sefcontext to community.general.sefcontext redirecting (type: modules) community.general.sefcontext to community.general.system.sefcontext redirecting (type: modules) ansible.builtin.seport to community.general.seport redirecting (type: modules) community.general.seport to community.general.system.seport redirecting (type: modules) ansible.builtin.selogin to community.general.selogin redirecting (type: modules) community.general.selogin to community.general.system.selogin TASK [fedora.linux_system_roles.selinux : Set ansible_facts required by role and install packages] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:2 Wednesday 27 July 2022 21:18:08 +0000 
(0:00:00.122) 0:00:25.200 ******** included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml for /cache/fedora-35.qcow2.snap TASK [fedora.linux_system_roles.selinux : Ensure ansible_facts used by role] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:2 Wednesday 27 July 2022 21:18:08 +0000 (0:00:00.037) 0:00:25.238 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Install SELinux python2 tools] ******* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:7 Wednesday 27 July 2022 21:18:08 +0000 (0:00:00.035) 0:00:25.274 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Install SELinux python3 tools] ******* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:15 Wednesday 27 July 2022 21:18:08 +0000 (0:00:00.026) 0:00:25.300 ******** ok: [/cache/fedora-35.qcow2.snap] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.selinux : refresh facts] *********************** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:23 Wednesday 27 July 2022 21:18:10 +0000 (0:00:01.923) 0:00:27.223 ******** ok: [/cache/fedora-35.qcow2.snap] TASK [fedora.linux_system_roles.selinux : Install SELinux tool semanage] ******* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:28 Wednesday 27 July 2022 21:18:10 +0000 (0:00:00.792) 0:00:28.015 ******** changed: [/cache/fedora-35.qcow2.snap] => { "changed": true, "rc": 0, "results": [ 
"Installed: policycoreutils-python-utils-3.3-1.fc35.noarch" ] } TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if enabled] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:5 Wednesday 27 July 2022 21:18:13 +0000 (0:00:02.428) 0:00:30.444 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set permanent SELinux state if disabled] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:12 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.028) 0:00:30.473 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set selinux_reboot_required] ********* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:19 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.023) 0:00:30.496 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "selinux_reboot_required": false }, "changed": false } TASK [fedora.linux_system_roles.selinux : Fail if reboot is required] ********** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:23 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.046) 0:00:30.543 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Warn if SELinux is disabled] ********* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:28 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.022) 0:00:30.566 ******** skipping: [/cache/fedora-35.qcow2.snap] => {} TASK [fedora.linux_system_roles.selinux : Drop all local modifications] ******** task path: 
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:33 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.021) 0:00:30.587 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux boolean local modifications] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:40 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.032) 0:00:30.620 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux file context local modifications] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:44 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.032) 0:00:30.652 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux port local modifications] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:48 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.031) 0:00:30.684 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Purge all SELinux login local modifications] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:52 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.031) 0:00:30.716 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.selinux : Set SELinux booleans] **************** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:56 
Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.034) 0:00:30.750 ******** TASK [fedora.linux_system_roles.selinux : Set SELinux file contexts] *********** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:63 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.020) 0:00:30.771 ******** TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem tree] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:72 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.019) 0:00:30.791 ******** TASK [fedora.linux_system_roles.selinux : Restore SELinux labels on filesystem tree in check mode] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:78 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.025) 0:00:30.816 ******** TASK [fedora.linux_system_roles.selinux : Set an SELinux label on a port] ****** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:86 Wednesday 27 July 2022 21:18:13 +0000 (0:00:00.022) 0:00:30.838 ******** redirecting (type: modules) ansible.builtin.seport to community.general.seport redirecting (type: modules) community.general.seport to community.general.system.seport changed: [/cache/fedora-35.qcow2.snap] => (item={'ports': '8080-8082', 'setype': 'http_port_t'}) => { "ansible_loop_var": "item", "changed": true, "item": { "ports": "8080-8082", "setype": "http_port_t" }, "ports": [ "8080-8082" ], "proto": "tcp", "setype": "http_port_t", "state": "present" } TASK [fedora.linux_system_roles.selinux : Set linux user to SELinux user mapping] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:94 Wednesday 27 July 2022 21:18:15 +0000 (0:00:01.638) 0:00:32.477 ******** TASK [fedora.linux_system_roles.selinux : Get SELinux modules facts] *********** task path: 
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:104 Wednesday 27 July 2022 21:18:15 +0000 (0:00:00.023) 0:00:32.500 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "selinux_installed_modules": { "abrt": { "100": "enabled" }, "accountsd": { "100": "enabled" }, "acct": { "100": "enabled" }, "afs": { "100": "enabled" }, "aiccu": { "100": "enabled" }, "aide": { "100": "enabled" }, "ajaxterm": { "100": "enabled" }, "alsa": { "100": "enabled" }, "amanda": { "100": "enabled" }, "amtu": { "100": "enabled" }, "anaconda": { "100": "enabled" }, "antivirus": { "100": "enabled" }, "apache": { "100": "enabled" }, "apcupsd": { "100": "enabled" }, "apm": { "100": "enabled" }, "application": { "100": "enabled" }, "arpwatch": { "100": "enabled" }, "asterisk": { "100": "enabled" }, "auditadm": { "100": "enabled" }, "authconfig": { "100": "enabled" }, "authlogin": { "100": "enabled" }, "automount": { "100": "enabled" }, "avahi": { "100": "enabled" }, "awstats": { "100": "enabled" }, "bacula": { "100": "enabled" }, "base": { "100": "enabled" }, "bcfg2": { "100": "enabled" }, "bind": { "100": "enabled" }, "bitlbee": { "100": "enabled" }, "blkmapd": { "100": "enabled" }, "blueman": { "100": "enabled" }, "bluetooth": { "100": "enabled" }, "boinc": { "100": "enabled" }, "boltd": { "100": "enabled" }, "bootloader": { "100": "enabled" }, "brctl": { "100": "enabled" }, "brltty": { "100": "enabled" }, "bugzilla": { "100": "enabled" }, "bumblebee": { "100": "enabled" }, "cachefilesd": { "100": "enabled" }, "calamaris": { "100": "enabled" }, "callweaver": { "100": "enabled" }, "canna": { "100": "enabled" }, "ccs": { "100": "enabled" }, "cdrecord": { "100": "enabled" }, "certmaster": { "100": "enabled" }, "certmonger": { "100": "enabled" }, "certwatch": { "100": "enabled" }, "cfengine": { "100": "enabled" }, "cgroup": { "100": "enabled" }, "chrome": { "100": "enabled" }, "chronyd": { "100": "enabled" }, "cinder": { "100": "enabled" }, 
"cipe": { "100": "enabled" }, "clock": { "100": "enabled" }, "clogd": { "100": "enabled" }, "cloudform": { "100": "enabled" }, "cmirrord": { "100": "enabled" }, "cobbler": { "100": "enabled" }, "cockpit": { "100": "enabled" }, "collectd": { "100": "enabled" }, "colord": { "100": "enabled" }, "comsat": { "100": "enabled" }, "condor": { "100": "enabled" }, "conman": { "100": "enabled" }, "conntrackd": { "100": "enabled" }, "consolekit": { "100": "enabled" }, "container": { "200": "enabled" }, "couchdb": { "100": "enabled" }, "courier": { "100": "enabled" }, "cpucontrol": { "100": "enabled" }, "cpufreqselector": { "100": "enabled" }, "cpuplug": { "100": "enabled" }, "cron": { "100": "enabled" }, "ctdb": { "100": "enabled" }, "cups": { "100": "enabled" }, "cvs": { "100": "enabled" }, "cyphesis": { "100": "enabled" }, "cyrus": { "100": "enabled" }, "daemontools": { "100": "enabled" }, "dbadm": { "100": "enabled" }, "dbskk": { "100": "enabled" }, "dbus": { "100": "enabled" }, "dcc": { "100": "enabled" }, "ddclient": { "100": "enabled" }, "denyhosts": { "100": "enabled" }, "devicekit": { "100": "enabled" }, "dhcp": { "100": "enabled" }, "dictd": { "100": "enabled" }, "dirsrv": { "100": "enabled" }, "dirsrv-admin": { "100": "enabled" }, "dmesg": { "100": "enabled" }, "dmidecode": { "100": "enabled" }, "dnsmasq": { "100": "enabled" }, "dnssec": { "100": "enabled" }, "dovecot": { "100": "enabled" }, "drbd": { "100": "enabled" }, "dspam": { "100": "enabled" }, "entropyd": { "100": "enabled" }, "exim": { "100": "enabled" }, "fail2ban": { "100": "enabled" }, "fcoe": { "100": "enabled" }, "fedoratp": { "100": "enabled" }, "fetchmail": { "100": "enabled" }, "finger": { "100": "enabled" }, "firewalld": { "100": "enabled" }, "firewallgui": { "100": "enabled" }, "firstboot": { "100": "enabled" }, "fprintd": { "100": "enabled" }, "freeipmi": { "100": "enabled" }, "freqset": { "100": "enabled" }, "fstools": { "100": "enabled" }, "ftp": { "100": "enabled" }, "fwupd": { "100": "enabled" 
}, "games": { "100": "enabled" }, "gdomap": { "100": "enabled" }, "geoclue": { "100": "enabled" }, "getty": { "100": "enabled" }, "git": { "100": "enabled" }, "gitosis": { "100": "enabled" }, "glance": { "100": "enabled" }, "glusterd": { "100": "enabled" }, "gnome": { "100": "enabled" }, "gpg": { "100": "enabled" }, "gpm": { "100": "enabled" }, "gpsd": { "100": "enabled" }, "gssproxy": { "100": "enabled" }, "guest": { "100": "enabled" }, "hddtemp": { "100": "enabled" }, "hostapd": { "100": "enabled" }, "hostname": { "100": "enabled" }, "hsqldb": { "100": "enabled" }, "hwloc": { "100": "enabled" }, "hypervkvp": { "100": "enabled" }, "ibacm": { "100": "enabled" }, "ica": { "100": "enabled" }, "icecast": { "100": "enabled" }, "inetd": { "100": "enabled" }, "init": { "100": "enabled" }, "inn": { "100": "enabled" }, "iodine": { "100": "enabled" }, "iotop": { "100": "enabled" }, "ipa": { "100": "enabled" }, "ipmievd": { "100": "enabled" }, "ipsec": { "100": "enabled" }, "iptables": { "100": "enabled" }, "irc": { "100": "enabled" }, "irqbalance": { "100": "enabled" }, "iscsi": { "100": "enabled" }, "isns": { "100": "enabled" }, "jabber": { "100": "enabled" }, "jetty": { "100": "enabled" }, "jockey": { "100": "enabled" }, "journalctl": { "100": "enabled" }, "kdump": { "100": "enabled" }, "kdumpgui": { "100": "enabled" }, "keepalived": { "100": "enabled" }, "kerberos": { "100": "enabled" }, "keyboardd": { "100": "enabled" }, "keystone": { "100": "enabled" }, "kismet": { "100": "enabled" }, "kmscon": { "100": "enabled" }, "kpatch": { "100": "enabled" }, "ksmtuned": { "100": "enabled" }, "ktalk": { "100": "enabled" }, "l2tp": { "100": "enabled" }, "ldap": { "100": "enabled" }, "libraries": { "100": "enabled" }, "likewise": { "100": "enabled" }, "linuxptp": { "100": "enabled" }, "lircd": { "100": "enabled" }, "livecd": { "100": "enabled" }, "lldpad": { "100": "enabled" }, "loadkeys": { "100": "enabled" }, "locallogin": { "100": "enabled" }, "lockdev": { "100": "enabled" }, 
"logadm": { "100": "enabled" }, "logging": { "100": "enabled" }, "logrotate": { "100": "enabled" }, "logwatch": { "100": "enabled" }, "lpd": { "100": "enabled" }, "lsm": { "100": "enabled" }, "lttng-tools": { "100": "enabled" }, "lvm": { "100": "enabled" }, "mailman": { "100": "enabled" }, "mailscanner": { "100": "enabled" }, "man2html": { "100": "enabled" }, "mandb": { "100": "enabled" }, "mcelog": { "100": "enabled" }, "mediawiki": { "100": "enabled" }, "memcached": { "100": "enabled" }, "milter": { "100": "enabled" }, "minidlna": { "100": "enabled" }, "minissdpd": { "100": "enabled" }, "mip6d": { "100": "enabled" }, "mirrormanager": { "100": "enabled" }, "miscfiles": { "100": "enabled" }, "mock": { "100": "enabled" }, "modemmanager": { "100": "enabled" }, "modutils": { "100": "enabled" }, "mojomojo": { "100": "enabled" }, "mon_statd": { "100": "enabled" }, "mongodb": { "100": "enabled" }, "motion": { "100": "enabled" }, "mount": { "100": "enabled" }, "mozilla": { "100": "enabled" }, "mpd": { "100": "enabled" }, "mplayer": { "100": "enabled" }, "mrtg": { "100": "enabled" }, "mta": { "100": "enabled" }, "munin": { "100": "enabled" }, "mysql": { "100": "enabled" }, "mythtv": { "100": "enabled" }, "naemon": { "100": "enabled" }, "nagios": { "100": "enabled" }, "namespace": { "100": "enabled" }, "ncftool": { "100": "enabled" }, "netlabel": { "100": "enabled" }, "netutils": { "100": "enabled" }, "networkmanager": { "100": "enabled" }, "ninfod": { "100": "enabled" }, "nis": { "100": "enabled" }, "nova": { "100": "enabled" }, "nscd": { "100": "enabled" }, "nsd": { "100": "enabled" }, "nslcd": { "100": "enabled" }, "ntop": { "100": "enabled" }, "ntp": { "100": "enabled" }, "numad": { "100": "enabled" }, "nut": { "100": "enabled" }, "nx": { "100": "enabled" }, "obex": { "100": "enabled" }, "oddjob": { "100": "enabled" }, "opafm": { "100": "enabled" }, "openct": { "100": "enabled" }, "opendnssec": { "100": "enabled" }, "openfortivpn": { "100": "enabled" }, "openhpid": { 
"100": "enabled" }, "openshift": { "100": "enabled" }, "openshift-origin": { "100": "enabled" }, "opensm": { "100": "enabled" }, "openvpn": { "100": "enabled" }, "openvswitch": { "100": "enabled" }, "openwsman": { "100": "enabled" }, "oracleasm": { "100": "enabled" }, "osad": { "100": "enabled" }, "pads": { "100": "enabled" }, "passenger": { "100": "enabled" }, "pcmcia": { "100": "enabled" }, "pcp": { "100": "enabled" }, "pcscd": { "100": "enabled" }, "pdns": { "100": "enabled" }, "pegasus": { "100": "enabled" }, "permissivedomains": { "100": "enabled" }, "pesign": { "100": "enabled" }, "pingd": { "100": "enabled" }, "piranha": { "100": "enabled" }, "pkcs": { "100": "enabled" }, "pkcs11proxyd": { "100": "enabled" }, "pki": { "100": "enabled" }, "plymouthd": { "100": "enabled" }, "podsleuth": { "100": "enabled" }, "policykit": { "100": "enabled" }, "polipo": { "100": "enabled" }, "portmap": { "100": "enabled" }, "portreserve": { "100": "enabled" }, "postfix": { "100": "enabled" }, "postgresql": { "100": "enabled" }, "postgrey": { "100": "enabled" }, "ppp": { "100": "enabled" }, "prelink": { "100": "enabled" }, "prelude": { "100": "enabled" }, "privoxy": { "100": "enabled" }, "procmail": { "100": "enabled" }, "prosody": { "100": "enabled" }, "psad": { "100": "enabled" }, "ptchown": { "100": "enabled" }, "publicfile": { "100": "enabled" }, "pulseaudio": { "100": "enabled" }, "puppet": { "100": "enabled" }, "pwauth": { "100": "enabled" }, "qmail": { "100": "enabled" }, "qpid": { "100": "enabled" }, "quantum": { "100": "enabled" }, "quota": { "100": "enabled" }, "rabbitmq": { "100": "enabled" }, "radius": { "100": "enabled" }, "radvd": { "100": "enabled" }, "raid": { "100": "enabled" }, "rasdaemon": { "100": "enabled" }, "rdisc": { "100": "enabled" }, "readahead": { "100": "enabled" }, "realmd": { "100": "enabled" }, "redis": { "100": "enabled" }, "remotelogin": { "100": "enabled" }, "rhcs": { "100": "enabled" }, "rhev": { "100": "enabled" }, "rhgb": { "100": "enabled" 
}, "rhnsd": { "100": "enabled" }, "rhsmcertd": { "100": "enabled" }, "ricci": { "100": "enabled" }, "rkhunter": { "100": "enabled" }, "rkt": { "100": "enabled" }, "rlogin": { "100": "enabled" }, "rngd": { "100": "enabled" }, "rolekit": { "100": "enabled" }, "roundup": { "100": "enabled" }, "rpc": { "100": "enabled" }, "rpcbind": { "100": "enabled" }, "rpm": { "100": "enabled" }, "rrdcached": { "100": "enabled" }, "rshd": { "100": "enabled" }, "rssh": { "100": "enabled" }, "rsync": { "100": "enabled" }, "rtas": { "100": "enabled" }, "rtkit": { "100": "enabled" }, "rwho": { "100": "enabled" }, "samba": { "100": "enabled" }, "sambagui": { "100": "enabled" }, "sandboxX": { "100": "enabled" }, "sanlock": { "100": "enabled" }, "sasl": { "100": "enabled" }, "sbd": { "100": "enabled" }, "sblim": { "100": "enabled" }, "screen": { "100": "enabled" }, "secadm": { "100": "enabled" }, "sectoolm": { "100": "enabled" }, "selinuxutil": { "100": "enabled" }, "sendmail": { "100": "enabled" }, "sensord": { "100": "enabled" }, "setrans": { "100": "enabled" }, "setroubleshoot": { "100": "enabled" }, "seunshare": { "100": "enabled" }, "sge": { "100": "enabled" }, "shorewall": { "100": "enabled" }, "slocate": { "100": "enabled" }, "slpd": { "100": "enabled" }, "smartmon": { "100": "enabled" }, "smokeping": { "100": "enabled" }, "smoltclient": { "100": "enabled" }, "smsd": { "100": "enabled" }, "snapper": { "100": "enabled" }, "snmp": { "100": "enabled" }, "snort": { "100": "enabled" }, "sosreport": { "100": "enabled" }, "soundserver": { "100": "enabled" }, "spamassassin": { "100": "enabled" }, "speech-dispatcher": { "100": "enabled" }, "squid": { "100": "enabled" }, "ssh": { "100": "enabled" }, "sslh": { "100": "enabled" }, "sssd": { "100": "enabled" }, "staff": { "100": "enabled" }, "stapserver": { "100": "enabled" }, "stratisd": { "100": "enabled" }, "stunnel": { "100": "enabled" }, "su": { "100": "enabled" }, "sudo": { "100": "enabled" }, "svnserve": { "100": "enabled" }, "swift": { 
"100": "enabled" }, "sysadm": { "100": "enabled" }, "sysadm_secadm": { "100": "enabled" }, "sysnetwork": { "100": "enabled" }, "sysstat": { "100": "enabled" }, "systemd": { "100": "enabled" }, "tangd": { "100": "enabled" }, "targetd": { "100": "enabled" }, "tcpd": { "100": "enabled" }, "tcsd": { "100": "enabled" }, "telepathy": { "100": "enabled" }, "telnet": { "100": "enabled" }, "tftp": { "100": "enabled" }, "tgtd": { "100": "enabled" }, "thin": { "100": "enabled" }, "thumb": { "100": "enabled" }, "timedatex": { "100": "enabled" }, "tlp": { "100": "enabled" }, "tmpreaper": { "100": "enabled" }, "tomcat": { "100": "enabled" }, "tor": { "100": "enabled" }, "tuned": { "100": "enabled" }, "tvtime": { "100": "enabled" }, "udev": { "100": "enabled" }, "ulogd": { "100": "enabled" }, "uml": { "100": "enabled" }, "unconfined": { "100": "enabled" }, "unconfineduser": { "100": "enabled" }, "unlabelednet": { "100": "enabled" }, "unprivuser": { "100": "enabled" }, "updfstab": { "100": "enabled" }, "usbmodules": { "100": "enabled" }, "usbmuxd": { "100": "enabled" }, "userdomain": { "100": "enabled" }, "userhelper": { "100": "enabled" }, "usermanage": { "100": "enabled" }, "usernetctl": { "100": "enabled" }, "uucp": { "100": "enabled" }, "uuidd": { "100": "enabled" }, "varnishd": { "100": "enabled" }, "vdagent": { "100": "enabled" }, "vhostmd": { "100": "enabled" }, "virt": { "100": "enabled" }, "vlock": { "100": "enabled" }, "vmtools": { "100": "enabled" }, "vmware": { "100": "enabled" }, "vnstatd": { "100": "enabled" }, "vpn": { "100": "enabled" }, "w3c": { "100": "enabled" }, "watchdog": { "100": "enabled" }, "wdmd": { "100": "enabled" }, "webadm": { "100": "enabled" }, "webalizer": { "100": "enabled" }, "wine": { "100": "enabled" }, "wireshark": { "100": "enabled" }, "xen": { "100": "enabled" }, "xguest": { "100": "enabled" }, "xserver": { "100": "enabled" }, "zabbix": { "100": "enabled" }, "zarafa": { "100": "enabled" }, "zebra": { "100": "enabled" }, "zoneminder": { 
"100": "enabled" }, "zosremote": { "100": "enabled" } }, "selinux_priorities": true }, "changed": false } TASK [fedora.linux_system_roles.selinux : include_tasks] *********************** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:107 Wednesday 27 July 2022 21:18:15 +0000 (0:00:00.590) 0:00:33.091 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } META: role_complete for /cache/fedora-35.qcow2.snap TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] ***** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:31 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.053) 0:00:33.145 ******** included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/fedora-35.qcow2.snap => (item={'state': 'started', 'debug': True, 'log_level': 'debug', 'run_as_user': 'user1', 'kube_file_content': {'apiVersion': 'v1', 'kind': 'Pod', 'metadata': {'labels': {'app': 'test', 'io.containers.autoupdate': 'registry'}, 'name': 'httpd1'}, 'spec': {'containers': [{'name': 'httpd1', 'image': 'quay.io/libpod/testimage:20210610', 'command': ['/bin/busybox-extras', 'httpd', '-f', '-p', 80], 'ports': [{'containerPort': 80, 'hostPort': 8080}], 'volumeMounts': [{'mountPath': '/var/www:Z', 'name': 'www'}, {'mountPath': '/var/httpd-create:Z', 'name': 'create'}], 'workingDir': '/var/www'}], 'volumes': [{'name': 'www', 'hostPath': {'path': '/tmp/httpd1'}}, {'name': 'create', 'hostPath': {'path': '/tmp/httpd1-create'}}]}}}) included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/fedora-35.qcow2.snap => (item={'state': 'started', 'debug': True, 'log_level': 'debug', 'kube_file_content': {'apiVersion': 'v1', 'kind': 'Pod', 'metadata': {'labels': {'app': 'test', 'io.containers.autoupdate': 
'registry'}, 'name': 'httpd2'}, 'spec': {'containers': [{'name': 'httpd2', 'image': 'quay.io/libpod/testimage:20210610', 'command': ['/bin/busybox-extras', 'httpd', '-f', '-p', 80], 'ports': [{'containerPort': 80, 'hostPort': 8081}], 'volumeMounts': [{'mountPath': '/var/www:Z', 'name': 'www'}, {'mountPath': '/var/httpd-create:Z', 'name': 'create'}], 'workingDir': '/var/www'}], 'volumes': [{'name': 'www', 'hostPath': {'path': '/tmp/httpd2'}}, {'name': 'create', 'hostPath': {'path': '/tmp/httpd2-create'}}]}}}) included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml for /cache/fedora-35.qcow2.snap => (item={'state': 'started', 'kube_file_src': '/tmp/lsr_podman_v5rizl4i.yml'}) TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:13 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.097) 0:00:33.242 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__podman_kube_spec": { "debug": true, "log_level": "debug", "state": "started" }, "__podman_kube_str": "apiVersion: v1\nkind: Pod\nmetadata:\n labels:\n app: test\n io.containers.autoupdate: registry\n name: httpd1\nspec:\n containers:\n - command:\n - /bin/busybox-extras\n - httpd\n - -f\n - -p\n - 80\n image: quay.io/libpod/testimage:20210610\n name: httpd1\n ports:\n - containerPort: 80\n hostPort: 8080\n volumeMounts:\n - mountPath: /var/www:Z\n name: www\n - mountPath: /var/httpd-create:Z\n name: create\n workingDir: /var/www\n volumes:\n - hostPath:\n path: /tmp/httpd1\n name: www\n - hostPath:\n path: /tmp/httpd1-create\n name: create\n" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:25 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.056) 
0:00:33.299 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__podman_kube": { "apiVersion": "v1", "kind": "Pod", "metadata": { "labels": { "app": "test", "io.containers.autoupdate": "registry" }, "name": "httpd1" }, "spec": { "containers": [ { "command": [ "/bin/busybox-extras", "httpd", "-f", "-p", 80 ], "image": "quay.io/libpod/testimage:20210610", "name": "httpd1", "ports": [ { "containerPort": 80, "hostPort": 8080 } ], "volumeMounts": [ { "mountPath": "/var/www:Z", "name": "www" }, { "mountPath": "/var/httpd-create:Z", "name": "create" } ], "workingDir": "/var/www" } ], "volumes": [ { "hostPath": { "path": "/tmp/httpd1" }, "name": "www" }, { "hostPath": { "path": "/tmp/httpd1-create" }, "name": "create" } ] } }, "__podman_kube_file": "", "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "user1" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:35 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.056) 0:00:33.356 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__podman_kube_name": "httpd1", "__podman_rootless": true }, "changed": false } TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:40 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.038) 0:00:33.395 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "getent_passwd": { "user1": [ "x", "1001", "1001", "", "/home/user1", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:46 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.486) 0:00:33.881 ******** 
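
[Editor's note: for readability, the escaped `__podman_kube_str` value shown in the "Set per-container variables part 0" task above decodes to the following Pod manifest. The content is taken verbatim from the log's escaped string; no fields have been added or changed.]

```yaml
apiVersion: v1
kind: Pod
metadata:
  labels:
    app: test
    io.containers.autoupdate: registry
  name: httpd1
spec:
  containers:
  - command:
    - /bin/busybox-extras
    - httpd
    - -f
    - -p
    - 80
    image: quay.io/libpod/testimage:20210610
    name: httpd1
    ports:
    - containerPort: 80
      hostPort: 8080
    volumeMounts:
    - mountPath: /var/www:Z
      name: www
    - mountPath: /var/httpd-create:Z
      name: create
    workingDir: /var/www
  volumes:
  - hostPath:
      path: /tmp/httpd1
    name: www
  - hostPath:
      path: /tmp/httpd1-create
    name: create
```
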
skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if no kube spec is given] ******** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:53 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.027) 0:00:33.909 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:62 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.026) 0:00:33.935 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_group": "1001", "__podman_systemd_scope": "user", "__podman_user_home_dir": "/home/user1", "__podman_xdg_runtime_dir": "/run/user/1001" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:78 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.066) 0:00:34.002 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__podman_kube_path": "/home/user1/.config/containers/ansible-kubernetes.d" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:82 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.040) 0:00:34.042 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__podman_kube_file": "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get service name using systemd-escape] *** task path: 
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:86 Wednesday 27 July 2022 21:18:16 +0000 (0:00:00.047) 0:00:34.090 ******** changed: [/cache/fedora-35.qcow2.snap] => { "changed": true, "cmd": [ "systemd-escape", "--template", "podman-kube@.service", "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml" ], "delta": "0:00:00.009175", "end": "2022-07-27 21:18:16.969698", "rc": 0, "start": "2022-07-27 21:18:16.960523" } STDOUT: podman-kube@-home-user1-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service TASK [fedora.linux_system_roles.podman : Cleanup containers and services] ****** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:93 Wednesday 27 July 2022 21:18:17 +0000 (0:00:00.421) 0:00:34.512 ******** skipping: [/cache/fedora-35.qcow2.snap] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update containers and services] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_kube_spec.yml:97 Wednesday 27 July 2022 21:18:17 +0000 (0:00:00.027) 0:00:34.539 ******** included: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml for /cache/fedora-35.qcow2.snap TASK [fedora.linux_system_roles.podman : Check if user is lingering] *********** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:1 Wednesday 27 July 2022 21:18:17 +0000 (0:00:00.059) 0:00:34.599 ******** ok: [/cache/fedora-35.qcow2.snap] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Enable lingering if needed] *********** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:7 Wednesday 27 July 2022 21:18:17 +0000 
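
[Editor's note: the STDOUT above comes from `systemd-escape --template podman-kube@.service <path>`. As a hedged sketch (not the role's actual code), the same unit name can be reproduced with plain `sed`: systemd-escape turns each `/` into `-` and escapes literal `-` as `\x2d`. This mimics only those two rules for this particular path.]

```shell
# Sketch: mimic systemd-escape's path escaping for the kube file path from the log.
# Assumption: only the "/" -> "-" and "-" -> "\x2d" rules matter for this input.
path="/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml"
# Escape dashes first, then convert slashes, so new dashes are not re-escaped.
esc=$(printf '%s' "$path" | sed -e 's/-/\\x2d/g' -e 's,/,-,g')
printf '%s\n' "podman-kube@${esc}.service"
# -> podman-kube@-home-user1-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service
```

The result matches the STDOUT recorded by the task; real `systemd-escape` also hex-escapes other special characters, which this path happens not to contain.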
(0:00:00.438) 0:00:35.037 ******** changed: [/cache/fedora-35.qcow2.snap] => { "changed": true, "cmd": [ "loginctl", "enable-linger", "user1" ], "delta": "0:00:00.025653", "end": "2022-07-27 21:18:17.922080", "rc": 0, "start": "2022-07-27 21:18:17.896427" } TASK [fedora.linux_system_roles.podman : Get the host mount volumes] *********** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:13 Wednesday 27 July 2022 21:18:18 +0000 (0:00:00.428) 0:00:35.466 ******** ok: [/cache/fedora-35.qcow2.snap] => { "ansible_facts": { "__podman_volumes": [ "/tmp/httpd1", "/tmp/httpd1-create" ] }, "changed": false } TASK [fedora.linux_system_roles.podman : Create host directories] ************** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:24 Wednesday 27 July 2022 21:18:18 +0000 (0:00:00.061) 0:00:35.527 ******** [WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat- unsafe) changed: [/cache/fedora-35.qcow2.snap] => (item=/tmp/httpd1) => { "ansible_loop_var": "item", "changed": true, "gid": 1001, "group": "user1", "item": "/tmp/httpd1", "mode": "0644", "owner": "user1", "path": "/tmp/httpd1", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 60, "state": "directory", "uid": 1001 } changed: [/cache/fedora-35.qcow2.snap] => (item=/tmp/httpd1-create) => { "ansible_loop_var": "item", "changed": true, "gid": 1001, "group": "user1", "item": "/tmp/httpd1-create", "mode": "0644", "owner": "user1", "path": "/tmp/httpd1-create", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 40, "state": "directory", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Ensure container images are present] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:35 
Wednesday 27 July 2022 21:18:19 +0000 (0:00:00.807) 0:00:36.335 ******** changed: [/cache/fedora-35.qcow2.snap] => (item=quay.io/libpod/testimage:20210610) => { "actions": [ "Pulled image quay.io/libpod/testimage:20210610" ], "ansible_loop_var": "item", "changed": true, "image": [ { "Annotations": {}, "Architecture": "amd64", "Author": "", "Comment": "", "Config": { "Cmd": [ "/bin/echo", "This container is intended for podman CI testing" ], "Env": [ "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin" ], "Labels": { "created_at": "2021-06-10T18:55:36Z", "created_by": "test/system/build-testimage", "io.buildah.version": "1.21.0" }, "WorkingDir": "/home/podman" }, "Created": "2021-06-10T18:55:43.049643585Z", "Digest": "sha256:d48f2feaca74863c342cd9ce11edbe208675975740e7f4dd635b7b345339426a", "GraphDriver": { "Data": { "UpperDir": "/home/user1/.local/share/containers/storage/overlay/f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e/diff", "WorkDir": "/home/user1/.local/share/containers/storage/overlay/f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e/work" }, "Name": "overlay" }, "History": [ { "created": "2021-06-10T18:55:42.831917915Z", "created_by": "/bin/sh -c apk add busybox-extras", "empty_layer": true }, { "created": "2021-06-10T18:55:43.005956291Z", "created_by": "/bin/sh -c #(nop) ADD multi:0ed825786ec12498034356148303d2e6dfd4698131f4b5d4599e5eafa2ab71bd in /home/podman/ ", "empty_layer": true }, { "created": "2021-06-10T18:55:43.006000972Z", "created_by": "/bin/sh -c #(nop) LABEL created_by=test/system/build-testimage", "empty_layer": true }, { "created": "2021-06-10T18:55:43.006019818Z", "created_by": "/bin/sh -c #(nop) LABEL created_at=2021-06-10T18:55:36Z", "empty_layer": true }, { "created": "2021-06-10T18:55:43.028748885Z", "created_by": "/bin/sh -c #(nop) WORKDIR /home/podman", "empty_layer": true }, { "comment": "FROM docker.io/amd64/alpine:3.13.5", "created": "2021-06-10T18:55:43.160651456Z", "created_by": 
"/bin/sh -c #(nop) CMD [\"/bin/echo\", \"This container is intended for podman CI testing\"]" } ], "Id": "9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f", "Labels": { "created_at": "2021-06-10T18:55:36Z", "created_by": "test/system/build-testimage", "io.buildah.version": "1.21.0" }, "ManifestType": "application/vnd.docker.distribution.manifest.v2+json", "NamesHistory": [ "quay.io/libpod/testimage:20210610" ], "Os": "linux", "Parent": "", "RepoDigests": [ "quay.io/libpod/testimage@sha256:d48f2feaca74863c342cd9ce11edbe208675975740e7f4dd635b7b345339426a", "quay.io/libpod/testimage@sha256:d8dc9f2a78e190963a75852ce55b926a1cf90c7d2e6d15b30b6bc43cd73a6377" ], "RepoTags": [ "quay.io/libpod/testimage:20210610" ], "RootFS": { "Layers": [ "sha256:f36118df491fbfd96093731809941d7bb881136415ccc114bc26d6bf10499a0e" ], "Type": "layers" }, "Size": 7987860, "User": "", "Version": "", "VirtualSize": 7987860 } ], "item": "quay.io/libpod/testimage:20210610", "podman_actions": [ "/usr/bin/podman image ls quay.io/libpod/testimage:20210610 --format json", "/usr/bin/podman pull quay.io/libpod/testimage:20210610 -q", "/usr/bin/podman inspect 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f --format json" ], "warnings": [ "Module remote_tmp /home/user1/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually" ] } [WARNING]: Module remote_tmp /home/user1/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. 
To avoid this, create the remote_tmp dir with the correct permissions manually TASK [fedora.linux_system_roles.podman : Check the kubernetes yaml file] ******* task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:51 Wednesday 27 July 2022 21:18:21 +0000 (0:00:01.948) 0:00:38.284 ******** ok: [/cache/fedora-35.qcow2.snap] => { "changed": false, "failed_when_result": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Ensure the kubernetes directory is present] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:59 Wednesday 27 July 2022 21:18:21 +0000 (0:00:00.391) 0:00:38.676 ******** changed: [/cache/fedora-35.qcow2.snap] => { "changed": true, "gid": 1001, "group": "user1", "mode": "0700", "owner": "user1", "path": "/home/user1/.config/containers/ansible-kubernetes.d", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 0, "state": "directory", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Ensure kubernetes yaml files are present] *** task path: /tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:68 Wednesday 27 July 2022 21:18:21 +0000 (0:00:00.436) 0:00:39.112 ******** changed: [/cache/fedora-35.qcow2.snap] => { "changed": true, "checksum": "f36f5b3fd8752a059ae217d04f65ba46b054d773", "dest": "/home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml", "gid": 1001, "group": "user1", "md5sum": "47ccac83d9de77d9645ae1ef4733269a", "mode": "0600", "owner": "user1", "secontext": "unconfined_u:object_r:config_home_t:s0", "size": 721, "src": "/root/.ansible/tmp/ansible-tmp-1658956702.0737054-90830-70263233992852/source", "state": "file", "uid": 1001 } TASK [fedora.linux_system_roles.podman : Update containers/pods] *************** task path: 
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:78
Wednesday 27 July 2022 21:18:22 +0000 (0:00:00.740) 0:00:39.853 ********
[WARNING]: Using a variable for a task's 'args' is unsafe in some situations (see
https://docs.ansible.com/ansible/devel/reference_appendices/faq.html#argsplat-unsafe)
fatal: [/cache/fedora-35.qcow2.snap]: FAILED! => {
    "changed": false
}

MSG:

Output:
[error starting container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7: /usr/bin/crun: symbol lookup error: /usr/bin/crun: undefined symbol: criu_join_ns_add: OCI runtime error]
[error starting container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7: /usr/bin/crun: symbol lookup error: /usr/bin/crun: undefined symbol: criu_join_ns_add: OCI runtime error
error starting container adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea: a dependency of container adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea failed to start: container state improper]
Pod: 70f0cc0b35c030796a0e8ebeab810bfedd230c81ebc45e830bf7d26ad2586b35
Container: adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea
Error=time="2022-07-27T21:18:22Z" level=info msg="/usr/bin/podman filtering at log level debug"
time="2022-07-27T21:18:22Z" level=debug msg="Called kube.PersistentPreRunE(/usr/bin/podman play kube --start=true --log-level=debug /home/user1/.config/containers/ansible-kubernetes.d/httpd1.yml)"
time="2022-07-27T21:18:22Z" level=debug msg="Merged system config \"/usr/share/containers/containers.conf\""
time="2022-07-27T21:18:22Z" level=debug msg="Merged system config \"/etc/containers/containers.conf.d/50-systemroles.conf\""
time="2022-07-27T21:18:22Z" level=debug msg="Using conmon: \"/usr/bin/conmon\""
time="2022-07-27T21:18:22Z" level=debug msg="Initializing boltdb state at /home/user1/.local/share/containers/storage/libpod/bolt_state.db"
time="2022-07-27T21:18:22Z" level=debug
msg="systemd-logind: Unknown object '/'." time="2022-07-27T21:18:22Z" level=debug msg="Using graph driver overlay" time="2022-07-27T21:18:22Z" level=debug msg="Using graph root /home/user1/.local/share/containers/storage" time="2022-07-27T21:18:22Z" level=debug msg="Using run root /run/user/1001/containers" time="2022-07-27T21:18:22Z" level=debug msg="Using static dir /home/user1/.local/share/containers/storage/libpod" time="2022-07-27T21:18:22Z" level=debug msg="Using tmp dir /run/user/1001/libpod/tmp" time="2022-07-27T21:18:22Z" level=debug msg="Using volume path /home/user1/.local/share/containers/storage/volumes" time="2022-07-27T21:18:22Z" level=debug msg="Set libpod namespace to \"\"" time="2022-07-27T21:18:22Z" level=debug msg="[graphdriver] trying provided driver \"overlay\"" time="2022-07-27T21:18:22Z" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T21:18:22Z" level=debug msg="Cached value indicated that overlay is supported" time="2022-07-27T21:18:22Z" level=debug msg="Cached value indicated that metacopy is not being used" time="2022-07-27T21:18:22Z" level=debug msg="Cached value indicated that native-diff is usable" time="2022-07-27T21:18:22Z" level=debug msg="backingFs=btrfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false" time="2022-07-27T21:18:22Z" level=debug msg="Initializing event backend journald" time="2022-07-27T21:18:22Z" level=debug msg="configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument" time="2022-07-27T21:18:22Z" level=debug msg="configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument" time="2022-07-27T21:18:22Z" level=debug msg="configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument" time="2022-07-27T21:18:22Z" level=debug msg="Using OCI runtime \"/usr/bin/crun\"" time="2022-07-27T21:18:22Z" 
level=info msg="Found CNI network podman (type=bridge) at /home/user1/.config/cni/net.d/87-podman.conflist" time="2022-07-27T21:18:22Z" level=debug msg="Default CNI network name podman is unchangeable" time="2022-07-27T21:18:22Z" level=info msg="Setting parallel job count to 13" time="2022-07-27T21:18:22Z" level=debug msg="Pulling image k8s.gcr.io/pause:3.5 (policy: newer)" time="2022-07-27T21:18:22Z" level=debug msg="Looking up image \"k8s.gcr.io/pause:3.5\" in local containers storage" time="2022-07-27T21:18:22Z" level=debug msg="Trying \"k8s.gcr.io/pause:3.5\" ..." time="2022-07-27T21:18:22Z" level=debug msg="Trying \"k8s.gcr.io/pause:3.5\" ..." time="2022-07-27T21:18:22Z" level=debug msg="Trying \"k8s.gcr.io/pause:3.5\" ..." time="2022-07-27T21:18:22Z" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf\"" time="2022-07-27T21:18:22Z" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/000-shortnames.conf\"" time="2022-07-27T21:18:22Z" level=debug msg="Loading registries configuration \"/etc/containers/registries.conf.d/50-systemroles.conf\"" time="2022-07-27T21:18:22Z" level=debug msg="Attempting to pull candidate k8s.gcr.io/pause:3.5 for k8s.gcr.io/pause:3.5" time="2022-07-27T21:18:22Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]k8s.gcr.io/pause:3.5\"" time="2022-07-27T21:18:22Z" level=debug msg="Copying source image //k8s.gcr.io/pause:3.5 to destination image [overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]k8s.gcr.io/pause:3.5" time="2022-07-27T21:18:22Z" level=debug msg="Trying to access \"k8s.gcr.io/pause:3.5\"" time="2022-07-27T21:18:22Z" level=debug msg="No credentials for k8s.gcr.io found" time="2022-07-27T21:18:22Z" level=debug msg="Using registries.d directory /etc/containers/registries.d for sigstore configuration" time="2022-07-27T21:18:22Z" level=debug msg=" Using 
\"default-docker\" configuration" time="2022-07-27T21:18:22Z" level=debug msg=" No signature storage configuration found for k8s.gcr.io/pause:3.5, using built-in default file:///home/user1/.local/share/containers/sigstore" time="2022-07-27T21:18:22Z" level=debug msg="Looking for TLS certificates and private keys in /etc/docker/certs.d/k8s.gcr.io" time="2022-07-27T21:18:22Z" level=debug msg="GET https://k8s.gcr.io/v2/" time="2022-07-27T21:18:23Z" level=debug msg="Ping https://k8s.gcr.io/v2/ status 401" time="2022-07-27T21:18:23Z" level=debug msg="GET https://k8s.gcr.io/v2/token?scope=repository%3Apause%3Apull&service=k8s.gcr.io" time="2022-07-27T21:18:23Z" level=debug msg="GET https://k8s.gcr.io/v2/pause/manifests/3.5" time="2022-07-27T21:18:23Z" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.list.v2+json\"" time="2022-07-27T21:18:23Z" level=debug msg="Using blob info cache at /home/user1/.local/share/containers/cache/blob-info-cache-v1.boltdb" time="2022-07-27T21:18:23Z" level=debug msg="Source is a manifest list; copying (only) instance sha256:369201a612f7b2b585a8e6ca99f77a36bcdbd032463d815388a96800b63ef2c8 for current system" time="2022-07-27T21:18:23Z" level=debug msg="GET https://k8s.gcr.io/v2/pause/manifests/sha256:369201a612f7b2b585a8e6ca99f77a36bcdbd032463d815388a96800b63ef2c8" time="2022-07-27T21:18:23Z" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.v2+json\"" time="2022-07-27T21:18:23Z" level=debug msg="IsRunningImageAllowed for image docker:k8s.gcr.io/pause:3.5" time="2022-07-27T21:18:23Z" level=debug msg=" Using default policy section" time="2022-07-27T21:18:23Z" level=debug msg=" Requirement 0: allowed" time="2022-07-27T21:18:23Z" level=debug msg="Overall: allowed" time="2022-07-27T21:18:23Z" level=debug msg="Downloading /v2/pause/blobs/sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459" time="2022-07-27T21:18:23Z" level=debug 
msg="GET https://k8s.gcr.io/v2/pause/blobs/sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459" time="2022-07-27T21:18:23Z" level=debug msg="Reading /home/user1/.local/share/containers/sigstore/pause@sha256=369201a612f7b2b585a8e6ca99f77a36bcdbd032463d815388a96800b63ef2c8/signature-1" time="2022-07-27T21:18:23Z" level=debug msg="Manifest has MIME type application/vnd.docker.distribution.manifest.v2+json, ordered candidate list [application/vnd.docker.distribution.manifest.v2+json, application/vnd.docker.distribution.manifest.v1+prettyjws, application/vnd.oci.image.manifest.v1+json, application/vnd.docker.distribution.manifest.v1+json]" time="2022-07-27T21:18:23Z" level=debug msg="... will first try using the original manifest unmodified" time="2022-07-27T21:18:23Z" level=debug msg="Failed to retrieve partial blob: blob type not supported for partial retrieval" time="2022-07-27T21:18:23Z" level=debug msg="Downloading /v2/pause/blobs/sha256:019d8da33d911d9baabe58ad63dea2107ed15115cca0fc27fc0f627e82a695c1" time="2022-07-27T21:18:23Z" level=debug msg="GET https://k8s.gcr.io/v2/pause/blobs/sha256:019d8da33d911d9baabe58ad63dea2107ed15115cca0fc27fc0f627e82a695c1" time="2022-07-27T21:18:23Z" level=debug msg="Detected compression format gzip" time="2022-07-27T21:18:23Z" level=debug msg="Using original blob without modification" time="2022-07-27T21:18:23Z" level=debug msg="Applying tar in /home/user1/.local/share/containers/storage/overlay/dee215ffc666313e1381d3e6e4299a4455503735b8df31c3fa161d2df50860a8/diff" time="2022-07-27T21:18:23Z" level=debug msg="No compression detected" time="2022-07-27T21:18:23Z" level=debug msg="Using original blob without modification" time="2022-07-27T21:18:23Z" level=debug msg="setting image creation date to 2021-03-16 13:16:57.822648569 +0000 UTC" time="2022-07-27T21:18:23Z" level=debug msg="created new image ID \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\"" time="2022-07-27T21:18:23Z" level=debug 
msg="saved image metadata \"{\\\"signatures-sizes\\\":{\\\"sha256:369201a612f7b2b585a8e6ca99f77a36bcdbd032463d815388a96800b63ef2c8\\\":[]}}\"" time="2022-07-27T21:18:23Z" level=debug msg="set names of image \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\" to [k8s.gcr.io/pause:3.5]" time="2022-07-27T21:18:23Z" level=debug msg="Pulled candidate k8s.gcr.io/pause:3.5 successfully" time="2022-07-27T21:18:23Z" level=debug msg="Looking up image \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\" in local containers storage" time="2022-07-27T21:18:23Z" level=debug msg="Trying \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\" ..." time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\"" time="2022-07-27T21:18:23Z" level=debug msg="Found image \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\" as \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\" in local containers storage" time="2022-07-27T21:18:23Z" level=debug msg="Found image \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\" as \"ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459)" time="2022-07-27T21:18:23Z" level=debug msg="Looking up image \"k8s.gcr.io/pause:3.5\" in local containers storage" time="2022-07-27T21:18:23Z" level=debug msg="Trying \"k8s.gcr.io/pause:3.5\" ..." 
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\"" time="2022-07-27T21:18:23Z" level=debug msg="Found image \"k8s.gcr.io/pause:3.5\" as \"k8s.gcr.io/pause:3.5\" in local containers storage" time="2022-07-27T21:18:23Z" level=debug msg="Found image \"k8s.gcr.io/pause:3.5\" as \"k8s.gcr.io/pause:3.5\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459)" time="2022-07-27T21:18:23Z" level=debug msg="Inspecting image ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459" time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\"" time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\"" time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\"" time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\"" time="2022-07-27T21:18:23Z" level=debug msg="Looking up image \"k8s.gcr.io/pause:3.5\" in local containers storage" time="2022-07-27T21:18:23Z" level=debug msg="Trying \"k8s.gcr.io/pause:3.5\" ..." 
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\""
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"k8s.gcr.io/pause:3.5\" as \"k8s.gcr.io/pause:3.5\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"k8s.gcr.io/pause:3.5\" as \"k8s.gcr.io/pause:3.5\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459)"
time="2022-07-27T21:18:23Z" level=debug msg="Inspecting image ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459"
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\""
time="2022-07-27T21:18:23Z" level=debug msg="Inspecting image ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459"
time="2022-07-27T21:18:23Z" level=debug msg="using systemd mode: false"
time="2022-07-27T21:18:23Z" level=debug msg="Adding exposed ports"
time="2022-07-27T21:18:23Z" level=debug msg="setting container name 70f0cc0b35c0-infra"
time="2022-07-27T21:18:23Z" level=debug msg="Loading seccomp profile from \"/usr/share/containers/seccomp.json\""
time="2022-07-27T21:18:23Z" level=debug msg="Allocated lock 1 for container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7"
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:ed210e3e4a5bae1237f1bb44d72a05a2f1e5c6bfe7a7e73da179e2534269c459\""
time="2022-07-27T21:18:23Z" level=debug msg="created container \"f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7\""
time="2022-07-27T21:18:23Z" level=debug msg="container \"f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7\" has work directory \"/home/user1/.local/share/containers/storage/overlay-containers/f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7/userdata\""
time="2022-07-27T21:18:23Z" level=debug msg="container \"f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7\" has run directory \"/run/user/1001/containers/overlay-containers/f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7/userdata\""
time="2022-07-27T21:18:23Z" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T21:18:23Z" level=debug msg="Pulling image quay.io/libpod/testimage:20210610 (policy: newer)"
time="2022-07-27T21:18:23Z" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T21:18:23Z" level=debug msg="Attempting to pull candidate quay.io/libpod/testimage:20210610 for quay.io/libpod/testimage:20210610"
time="2022-07-27T21:18:23Z" level=debug msg="Trying to access \"quay.io/libpod/testimage:20210610\""
time="2022-07-27T21:18:23Z" level=debug msg="No credentials for quay.io found"
time="2022-07-27T21:18:23Z" level=debug msg="Using registries.d directory /etc/containers/registries.d for sigstore configuration"
time="2022-07-27T21:18:23Z" level=debug msg=" Using \"default-docker\" configuration"
time="2022-07-27T21:18:23Z" level=debug msg=" No signature storage configuration found for quay.io/libpod/testimage:20210610, using built-in default file:///home/user1/.local/share/containers/sigstore"
time="2022-07-27T21:18:23Z" level=debug msg="Looking for TLS certificates and private keys in /etc/docker/certs.d/quay.io"
time="2022-07-27T21:18:23Z" level=debug msg="GET https://quay.io/v2/"
time="2022-07-27T21:18:23Z" level=debug msg="Ping https://quay.io/v2/ status 401"
time="2022-07-27T21:18:23Z" level=debug msg="GET https://quay.io/v2/auth?scope=repository%3Alibpod%2Ftestimage%3Apull&service=quay.io"
time="2022-07-27T21:18:23Z" level=debug msg="Increasing token expiration to: 60 seconds"
time="2022-07-27T21:18:23Z" level=debug msg="GET https://quay.io/v2/libpod/testimage/manifests/20210610"
time="2022-07-27T21:18:23Z" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.list.v2+json\""
time="2022-07-27T21:18:23Z" level=debug msg="GET https://quay.io/v2/libpod/testimage/manifests/sha256:d8dc9f2a78e190963a75852ce55b926a1cf90c7d2e6d15b30b6bc43cd73a6377"
time="2022-07-27T21:18:23Z" level=debug msg="Content-Type from manifest GET is \"application/vnd.docker.distribution.manifest.v2+json\""
time="2022-07-27T21:18:23Z" level=debug msg="Skipping pull candidate quay.io/libpod/testimage:20210610 as the image is not newer (pull policy newer)"
time="2022-07-27T21:18:23Z" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T21:18:23Z" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..."
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage"
time="2022-07-27T21:18:23Z" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)"
time="2022-07-27T21:18:23Z" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f"
time="2022-07-27T21:18:23Z" level=debug msg="using systemd mode: false"
time="2022-07-27T21:18:23Z" level=debug msg="adding container to pod httpd1_pod"
time="2022-07-27T21:18:23Z" level=debug msg="setting container name httpd1_pod-httpd1"
time="2022-07-27T21:18:23Z" level=debug msg="Loading default seccomp profile"
time="2022-07-27T21:18:23Z" level=info msg="Sysctl net.ipv4.ping_group_range=0 0 ignored in containers.conf, since Network Namespace set to host"
time="2022-07-27T21:18:23Z" level=debug msg="Adding mount /proc"
time="2022-07-27T21:18:23Z" level=debug msg="Adding mount /dev"
time="2022-07-27T21:18:23Z" level=debug msg="Adding mount /dev/pts"
time="2022-07-27T21:18:23Z" level=debug msg="Adding mount /dev/mqueue"
time="2022-07-27T21:18:23Z" level=debug msg="Adding mount /sys"
time="2022-07-27T21:18:23Z" level=debug msg="Adding mount /sys/fs/cgroup"
time="2022-07-27T21:18:23Z" level=debug msg="Allocated lock 2 for container adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea"
time="2022-07-27T21:18:23Z" level=debug msg="parsed reference into \"[overlay@/home/user1/.local/share/containers/storage+/run/user/1001/containers]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\""
time="2022-07-27T21:18:23Z" level=debug msg="created container \"adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea\""
time="2022-07-27T21:18:23Z" level=debug msg="container \"adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea\" has work directory \"/home/user1/.local/share/containers/storage/overlay-containers/adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea/userdata\""
time="2022-07-27T21:18:23Z" level=debug msg="container \"adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea\" has run directory \"/run/user/1001/containers/overlay-containers/adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea/userdata\""
time="2022-07-27T21:18:23Z" level=debug msg="Strongconnecting node adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea"
time="2022-07-27T21:18:23Z" level=debug msg="Pushed adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea onto stack"
time="2022-07-27T21:18:23Z" level=debug msg="Recursing to successor node f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7"
time="2022-07-27T21:18:23Z" level=debug msg="Strongconnecting node f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7"
time="2022-07-27T21:18:23Z" level=debug msg="Pushed f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 onto stack"
time="2022-07-27T21:18:23Z" level=debug msg="Finishing node f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7. Popped f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 off stack"
time="2022-07-27T21:18:23Z" level=debug msg="Finishing node adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea. Popped adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea off stack"
time="2022-07-27T21:18:23Z" level=debug msg="[graphdriver] trying provided driver \"overlay\""
time="2022-07-27T21:18:23Z" level=debug msg="Cached value indicated that overlay is supported"
time="2022-07-27T21:18:23Z" level=debug msg="Cached value indicated that overlay is supported"
time="2022-07-27T21:18:23Z" level=debug msg="Cached value indicated that metacopy is not being used"
time="2022-07-27T21:18:23Z" level=debug msg="backingFs=btrfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false"
time="2022-07-27T21:18:23Z" level=debug msg="overlay: mount_data=,lowerdir=/home/user1/.local/share/containers/storage/overlay/l/3AVDBV37GOLTGQJ5ENNMZYHAVT,upperdir=/home/user1/.local/share/containers/storage/overlay/98888254c913c982f0c162e551acb9c9875f840ae4310bc2eb3e901f36772ebd/diff,workdir=/home/user1/.local/share/containers/storage/overlay/98888254c913c982f0c162e551acb9c9875f840ae4310bc2eb3e901f36772ebd/work,userxattr,context=\"system_u:object_r:container_file_t:s0:c320,c862\""
time="2022-07-27T21:18:23Z" level=debug msg="mounted container \"f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7\" at \"/home/user1/.local/share/containers/storage/overlay/98888254c913c982f0c162e551acb9c9875f840ae4310bc2eb3e901f36772ebd/merged\""
time="2022-07-27T21:18:23Z" level=debug msg="Created root filesystem for container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 at /home/user1/.local/share/containers/storage/overlay/98888254c913c982f0c162e551acb9c9875f840ae4310bc2eb3e901f36772ebd/merged"
time="2022-07-27T21:18:23Z" level=debug msg="Made network namespace at /run/user/1001/netns/cni-14f026ac-098b-665c-9c25-3f5620a2b5aa for container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7"
time="2022-07-27T21:18:23Z" level=debug msg="slirp4netns command: /usr/bin/slirp4netns --disable-host-loopback --mtu=65520 --enable-sandbox --enable-seccomp -c -e 3 -r 4 --netns-type=path /run/user/1001/netns/cni-14f026ac-098b-665c-9c25-3f5620a2b5aa tap0"
time="2022-07-27T21:18:23Z" level=debug msg="rootlessport: time=\"2022-07-27T21:18:23Z\" level=info msg=\"starting parent driver\"\n"
time="2022-07-27T21:18:23Z" level=debug msg="rootlessport: time=\"2022-07-27T21:18:23Z\" level=info msg=\"opaque=map[builtin.readypipepath:/run/user/1001/libpod/tmp/rootlessport316137360/.bp-ready.pipe builtin.socketpath:/run/user/1001/libpod/tmp/rootlessport316137360/.bp.sock]\"\ntime=\"2022-07-27T21:18:23Z\" level=info msg=\"starting child driver in child netns (\\\"/proc/self/exe\\\" [containers-rootlessport-child])\"\n"
time="2022-07-27T21:18:23Z" level=debug msg="rootlessport: time=\"2022-07-27T21:18:23Z\" level=info msg=\"waiting for initComplete\"\n"
time="2022-07-27T21:18:23Z" level=debug msg="rootlessport: time=\"2022-07-27T21:18:23Z\" level=info msg=\"initComplete is closed; parent and child established the communication channel\"\ntime=\"2022-07-27T21:18:23Z\" level=info msg=\"exposing ports [{8080 80 tcp }]\"\n"
time="2022-07-27T21:18:23Z" level=debug msg="rootlessport is ready"
time="2022-07-27T21:18:23Z" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription"
time="2022-07-27T21:18:23Z" level=debug msg="Setting CGroups for container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 to user.slice:libpod:f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7"
time="2022-07-27T21:18:23Z" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d"
time="2022-07-27T21:18:23Z" level=debug msg="Workdir \"/\" resolved to host path \"/home/user1/.local/share/containers/storage/overlay/98888254c913c982f0c162e551acb9c9875f840ae4310bc2eb3e901f36772ebd/merged\""
time="2022-07-27T21:18:24Z" level=debug msg="Created OCI spec for container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 at /home/user1/.local/share/containers/storage/overlay-containers/f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7/userdata/config.json"
time="2022-07-27T21:18:24Z" level=debug msg="/usr/bin/conmon messages will be logged to syslog"
time="2022-07-27T21:18:24Z" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 -u f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 -r /usr/bin/crun -b /home/user1/.local/share/containers/storage/overlay-containers/f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7/userdata -p /run/user/1001/containers/overlay-containers/f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7/userdata/pidfile -n 70f0cc0b35c0-infra --exit-dir /run/user/1001/libpod/tmp/exits --full-attach -s -l journald --log-level debug --syslog --conmon-pidfile /run/user/1001/containers/overlay-containers/f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /home/user1/.local/share/containers/storage --exit-command-arg --runroot --exit-command-arg /run/user/1001/containers --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg systemd --exit-command-arg --tmpdir --exit-command-arg /run/user/1001/libpod/tmp --exit-command-arg --runtime --exit-command-arg crun --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --events-backend --exit-command-arg journald --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7]"
time="2022-07-27T21:18:24Z" level=info msg="Running conmon under slice user.slice and unitName libpod-conmon-f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7.scope"
[conmon:d]: failed to write to /proc/self/oom_score_adj: Permission denied
time="2022-07-27T21:18:24Z" level=debug msg="Received: -1"
/usr/bin/crun: symbol lookup error: /usr/bin/crun: undefined symbol: criu_join_ns_add
time="2022-07-27T21:18:24Z" level=error msg="Error removing container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7 from runtime after creation failed"
time="2022-07-27T21:18:24Z" level=debug msg="Cleaning up container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7"
time="2022-07-27T21:18:24Z" level=debug msg="Tearing down network namespace at /run/user/1001/netns/cni-14f026ac-098b-665c-9c25-3f5620a2b5aa for container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7"
time="2022-07-27T21:18:24Z" level=debug msg="unmounted container \"f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7\""
a container exists with the same name ("httpd1") as the pod in your YAML file; changing pod name to httpd1_pod
error starting container f88296887315ca9449767e4c2b035127fc055252774a6e0ce1e97ee8ca99e6f7: /usr/bin/crun: symbol lookup error: /usr/bin/crun: undefined symbol: criu_join_ns_add: OCI runtime error
error starting container adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea: a dependency of container adf8826465046a1bb2795de54212086fe5525f4b4f05615e3fa7996b2cc916ea failed to start: container state improper
Error: failed to start 2 containers

TASK [Clean up storage.conf] ***************************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:346
Wednesday 27 July 2022  21:18:24 +0000 (0:00:01.739)       0:00:41.592 ********
changed: [/cache/fedora-35.qcow2.snap] => {
    "changed": true,
    "path": "/etc/containers/storage.conf",
    "state": "absent"
}

TASK [Clean up host directories] ***********************************************
task path: /tmp/tmp1te44sz5/tests/podman/tests_basic.yml:353
Wednesday 27 July 2022  21:18:24 +0000 (0:00:00.398)       0:00:41.991 ********
changed: [/cache/fedora-35.qcow2.snap] => (item=httpd1) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "httpd1",
    "path": "/tmp/httpd1",
    "state": "absent"
}
changed: [/cache/fedora-35.qcow2.snap] => (item=httpd2) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "httpd2",
    "path": "/tmp/httpd2",
    "state": "absent"
}
changed: [/cache/fedora-35.qcow2.snap] => (item=httpd3) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "httpd3",
    "path": "/tmp/httpd3",
    "state": "absent"
}

PLAY RECAP *********************************************************************
/cache/fedora-35.qcow2.snap : ok=54   changed=24   unreachable=0   failed=1   skipped=30   rescued=0   ignored=0

Wednesday 27 July 2022  21:18:25 +0000 (0:00:01.119)       0:00:43.111 ********
===============================================================================
fedora.linux_system_roles.firewall : Install firewalld ------------------ 4.13s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:7
Install podman from updates-testing ------------------------------------- 3.64s
/tmp/tmp1te44sz5/tests/podman/tests_basic.yml:139 -----------------------------
fedora.linux_system_roles.selinux : Install SELinux tool semanage ------- 2.43s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:28
fedora.linux_system_roles.podman : Ensure required packages are installed --- 2.28s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Create data files ------------------------------------------------------- 2.03s
/tmp/tmp1te44sz5/tests/podman/tests_basic.yml:171 -----------------------------
fedora.linux_system_roles.podman : Ensure container images are present --- 1.95s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:35
fedora.linux_system_roles.selinux : Install SELinux python3 tools ------- 1.92s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:15
fedora.linux_system_roles.firewall : Install python3-firewall ----------- 1.84s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/firewalld.yml:18
fedora.linux_system_roles.podman : Update containers/pods --------------- 1.74s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:78
fedora.linux_system_roles.selinux : Set an SELinux label on a port ------ 1.64s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/main.yml:86
Create host directories for data ---------------------------------------- 1.25s
/tmp/tmp1te44sz5/tests/podman/tests_basic.yml:163 -----------------------------
Gathering Facts --------------------------------------------------------- 1.18s
/tmp/tmp1te44sz5/tests/podman/tests_basic.yml:3 -------------------------------
Clean up host directories ----------------------------------------------- 1.12s
/tmp/tmp1te44sz5/tests/podman/tests_basic.yml:353 -----------------------------
fedora.linux_system_roles.firewall : Enable and start firewalld service --- 0.97s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/firewall/tasks/main.yml:3
Enable podman copr ------------------------------------------------------ 0.93s
/tmp/tmp1te44sz5/tests/podman/tests_basic.yml:136 -----------------------------
fedora.linux_system_roles.podman : Create host directories -------------- 0.81s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:24
fedora.linux_system_roles.selinux : refresh facts ----------------------- 0.79s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/selinux/tasks/set_facts_packages.yml:23
fedora.linux_system_roles.podman : Update system container config file --- 0.75s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:21
fedora.linux_system_roles.podman : Ensure kubernetes yaml files are present --- 0.74s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_kube_spec.yml:68
fedora.linux_system_roles.podman : Update system registries config file --- 0.71s
/tmp/tmpkx6j_hnm/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:21